Mar 19 09:17:04.752946 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 19 09:17:05.465120 master-0 kubenswrapper[3979]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:17:05.465120 master-0 kubenswrapper[3979]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 09:17:05.465120 master-0 kubenswrapper[3979]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:17:05.465120 master-0 kubenswrapper[3979]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:17:05.465120 master-0 kubenswrapper[3979]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 09:17:05.465120 master-0 kubenswrapper[3979]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:17:05.466967 master-0 kubenswrapper[3979]: I0319 09:17:05.466696 3979 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 09:17:05.473265 master-0 kubenswrapper[3979]: W0319 09:17:05.473186 3979 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:17:05.473265 master-0 kubenswrapper[3979]: W0319 09:17:05.473247 3979 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:17:05.473265 master-0 kubenswrapper[3979]: W0319 09:17:05.473258 3979 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:17:05.473265 master-0 kubenswrapper[3979]: W0319 09:17:05.473272 3979 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:17:05.473265 master-0 kubenswrapper[3979]: W0319 09:17:05.473285 3979 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:17:05.473265 master-0 kubenswrapper[3979]: W0319 09:17:05.473295 3979 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473305 3979 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473316 3979 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473326 3979 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473336 3979 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473345 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473353 3979 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473361 3979 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473370 3979 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473379 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473387 3979 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473395 3979 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473403 3979 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473412 3979 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473424 3979 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473441 3979 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473451 3979 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473460 3979 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473469 3979 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:17:05.473551 master-0 kubenswrapper[3979]: W0319 09:17:05.473477 3979 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473486 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473494 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473503 3979 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473511 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473520 3979 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473554 3979 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473565 3979 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473574 3979 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473583 3979 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473593 3979 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473602 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473610 3979 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473618 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473627 3979 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473638 3979 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473649 3979 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473659 3979 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473668 3979 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:17:05.474191 master-0 kubenswrapper[3979]: W0319 09:17:05.473678 3979 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473688 3979 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473700 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473710 3979 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473719 3979 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473730 3979 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473741 3979 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473752 3979 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473761 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473769 3979 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473778 3979 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473786 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473795 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473804 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473812 3979 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473821 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473830 3979 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473838 3979 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473847 3979 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473855 3979 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:17:05.474785 master-0 kubenswrapper[3979]: W0319 09:17:05.473864 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473872 3979 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473881 3979 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473889 3979 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473897 3979 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473906 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473914 3979 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473923 3979 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: W0319 09:17:05.473931 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474128 3979 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474146 3979 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474161 3979 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474174 3979 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474189 3979 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474200 3979 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474213 3979 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474226 3979 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474237 3979 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474248 3979 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474259 3979 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474269 3979 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474279 3979 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 09:17:05.475591 master-0 kubenswrapper[3979]: I0319 09:17:05.474289 3979 flags.go:64] FLAG: --cgroup-root=""
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474299 3979 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474308 3979 flags.go:64] FLAG: --client-ca-file=""
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474318 3979 flags.go:64] FLAG: --cloud-config=""
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474327 3979 flags.go:64] FLAG: --cloud-provider=""
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474338 3979 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474352 3979 flags.go:64] FLAG: --cluster-domain=""
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474361 3979 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474371 3979 flags.go:64] FLAG: --config-dir=""
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474381 3979 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474392 3979 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474404 3979 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474414 3979 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474424 3979 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474434 3979 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474444 3979 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474454 3979 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474465 3979 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474475 3979 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474485 3979 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474496 3979 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474506 3979 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474516 3979 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474553 3979 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474564 3979 flags.go:64] FLAG: --enable-server="true"
Mar 19 09:17:05.476282 master-0 kubenswrapper[3979]: I0319 09:17:05.474573 3979 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474585 3979 flags.go:64] FLAG: --event-burst="100"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474597 3979 flags.go:64] FLAG: --event-qps="50"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474607 3979 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474617 3979 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474626 3979 flags.go:64] FLAG: --eviction-hard=""
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474638 3979 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474648 3979 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474658 3979 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474668 3979 flags.go:64] FLAG: --eviction-soft=""
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474678 3979 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474687 3979 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474697 3979 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474708 3979 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474717 3979 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474727 3979 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474736 3979 flags.go:64] FLAG: --feature-gates=""
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474748 3979 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474759 3979 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474769 3979 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474779 3979 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474790 3979 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474801 3979 flags.go:64] FLAG: --help="false"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474811 3979 flags.go:64] FLAG: --hostname-override=""
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474820 3979 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474830 3979 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 09:17:05.477047 master-0 kubenswrapper[3979]: I0319 09:17:05.474840 3979 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474850 3979 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474860 3979 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474870 3979 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474879 3979 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474888 3979 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474898 3979 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474907 3979 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474921 3979 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474931 3979 flags.go:64] FLAG: --kube-reserved=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474940 3979 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474950 3979 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474960 3979 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474970 3979 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474979 3979 flags.go:64] FLAG: --lock-file=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.474988 3979 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475004 3979 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475014 3979 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475030 3979 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475039 3979 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475050 3979 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475059 3979 flags.go:64] FLAG: --logging-format="text"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475069 3979 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475080 3979 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475089 3979 flags.go:64] FLAG: --manifest-url=""
Mar 19 09:17:05.477888 master-0 kubenswrapper[3979]: I0319 09:17:05.475098 3979 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475111 3979 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475121 3979 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475133 3979 flags.go:64] FLAG: --max-pods="110"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475143 3979 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475154 3979 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475164 3979 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475174 3979 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475185 3979 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475194 3979 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475204 3979 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475231 3979 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475241 3979 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475251 3979 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475283 3979 flags.go:64] FLAG: --pod-cidr=""
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475293 3979 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475308 3979 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475318 3979 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475328 3979 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475339 3979 flags.go:64] FLAG: --port="10250"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475349 3979 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475359 3979 flags.go:64] FLAG: --provider-id=""
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475374 3979 flags.go:64] FLAG: --qos-reserved=""
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475384 3979 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 09:17:05.478776 master-0 kubenswrapper[3979]: I0319 09:17:05.475398 3979 flags.go:64] FLAG: --register-node="true"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475407 3979 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475417 3979 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475433 3979 flags.go:64] FLAG: --registry-burst="10"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475483 3979 flags.go:64] FLAG: --registry-qps="5"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475494 3979 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475503 3979 flags.go:64] FLAG: --reserved-memory=""
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475516 3979 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475553 3979 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475563 3979 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475573 3979 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475583 3979 flags.go:64] FLAG: --runonce="false"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475593 3979 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475604 3979 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475614 3979 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475624 3979 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475634 3979 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475645 3979 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475654 3979 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475664 3979 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475674 3979 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475684 3979 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475694 3979 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475703 3979 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475713 3979 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 09:17:05.479578 master-0 kubenswrapper[3979]: I0319 09:17:05.475723 3979 flags.go:64] FLAG: --system-cgroups=""
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475733 3979 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475749 3979 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475758 3979 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475767 3979 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475780 3979 flags.go:64] FLAG: --tls-min-version=""
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475790 3979 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475804 3979 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475814 3979 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475823 3979 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475833 3979 flags.go:64] FLAG: --v="2"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475845 3979 flags.go:64] FLAG: --version="false"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475857 3979 flags.go:64] FLAG: --vmodule=""
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475869 3979 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: I0319 09:17:05.475879 3979 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476101 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476111 3979 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476121 3979 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476130 3979 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476139 3979 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476148 3979 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476157 3979 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476165 3979 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:17:05.480332 master-0 kubenswrapper[3979]: W0319 09:17:05.476174 3979 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476183 3979 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476192 3979 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476200 3979 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476211 3979 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476222 3979 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476231 3979 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476240 3979 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476248 3979 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476257 3979 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476265 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476274 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476284 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476294 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476302 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476311 3979 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476325 3979 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476334 3979 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476342 3979 feature_gate.go:330] 
unrecognized feature gate: NutanixMultiSubnets Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476352 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:17:05.481105 master-0 kubenswrapper[3979]: W0319 09:17:05.476363 3979 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476374 3979 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476385 3979 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476395 3979 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476404 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476413 3979 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476423 3979 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476432 3979 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476440 3979 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476449 3979 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476457 3979 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476466 3979 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476475 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476483 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476492 3979 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476501 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476509 3979 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476518 3979 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476554 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476564 3979 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:17:05.481700 master-0 kubenswrapper[3979]: W0319 09:17:05.476572 3979 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476581 3979 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476590 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476602 3979 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476613 3979 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476623 3979 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476632 3979 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476643 3979 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476659 3979 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476668 3979 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476679 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476687 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476697 3979 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476706 3979 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476716 3979 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476726 3979 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476736 3979 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 
19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476746 3979 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476755 3979 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:17:05.482279 master-0 kubenswrapper[3979]: W0319 09:17:05.476764 3979 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:17:05.482854 master-0 kubenswrapper[3979]: W0319 09:17:05.476772 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:17:05.482854 master-0 kubenswrapper[3979]: W0319 09:17:05.476781 3979 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:17:05.482854 master-0 kubenswrapper[3979]: W0319 09:17:05.476790 3979 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:17:05.482854 master-0 kubenswrapper[3979]: W0319 09:17:05.476798 3979 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:17:05.482854 master-0 kubenswrapper[3979]: I0319 09:17:05.476823 3979 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:17:05.488412 master-0 kubenswrapper[3979]: I0319 09:17:05.488335 3979 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 19 09:17:05.488412 master-0 kubenswrapper[3979]: I0319 09:17:05.488390 3979 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 19 
09:17:05.488569 master-0 kubenswrapper[3979]: W0319 09:17:05.488485 3979 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:17:05.488569 master-0 kubenswrapper[3979]: W0319 09:17:05.488494 3979 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:17:05.488569 master-0 kubenswrapper[3979]: W0319 09:17:05.488500 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 09:17:05.488569 master-0 kubenswrapper[3979]: W0319 09:17:05.488507 3979 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:17:05.488569 master-0 kubenswrapper[3979]: W0319 09:17:05.488514 3979 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:17:05.488569 master-0 kubenswrapper[3979]: W0319 09:17:05.488521 3979 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488578 3979 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488592 3979 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488601 3979 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488607 3979 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488613 3979 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488619 3979 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488624 3979 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: 
W0319 09:17:05.488630 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488635 3979 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488641 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488646 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488651 3979 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488657 3979 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488663 3979 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488668 3979 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488673 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488678 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488683 3979 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488689 3979 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:17:05.488764 master-0 kubenswrapper[3979]: W0319 09:17:05.488694 3979 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488700 3979 feature_gate.go:330] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488707 3979 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488712 3979 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488718 3979 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488723 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488728 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488733 3979 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488739 3979 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488747 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488753 3979 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488758 3979 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488765 3979 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488771 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488777 3979 feature_gate.go:330] unrecognized feature gate: 
EtcdBackendQuota Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488783 3979 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488788 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488797 3979 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488805 3979 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488811 3979 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:17:05.489358 master-0 kubenswrapper[3979]: W0319 09:17:05.488818 3979 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488824 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488829 3979 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488835 3979 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488841 3979 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488848 3979 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488857 3979 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488864 3979 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488871 3979 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488877 3979 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488883 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488888 3979 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488893 3979 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488898 3979 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488904 3979 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488909 3979 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488915 3979 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488920 3979 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:17:05.490294 master-0 kubenswrapper[3979]: W0319 09:17:05.488925 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:17:05.490294 master-0 
kubenswrapper[3979]: W0319 09:17:05.488930 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.488937 3979 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.488945 3979 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.488951 3979 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.488957 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.488964 3979 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.488970 3979 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.488976 3979 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: I0319 09:17:05.488985 3979 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.489144 3979 feature_gate.go:330] unrecognized feature 
gate: NetworkLiveMigration Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.489153 3979 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.489159 3979 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.489165 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.489171 3979 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:17:05.490906 master-0 kubenswrapper[3979]: W0319 09:17:05.489177 3979 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489182 3979 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489187 3979 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489193 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489198 3979 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489203 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489208 3979 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489214 3979 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489219 3979 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 
09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489224 3979 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489229 3979 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489234 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489240 3979 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489245 3979 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489250 3979 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489256 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489261 3979 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489266 3979 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489273 3979 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489281 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:17:05.491358 master-0 kubenswrapper[3979]: W0319 09:17:05.489288 3979 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489294 3979 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489299 3979 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489305 3979 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489310 3979 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489316 3979 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489321 3979 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489326 3979 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489332 3979 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489337 3979 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489343 3979 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489350 3979 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 
09:17:05.489356 3979 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489361 3979 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489367 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489373 3979 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489378 3979 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489384 3979 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489390 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489397 3979 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:17:05.492168 master-0 kubenswrapper[3979]: W0319 09:17:05.489406 3979 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489413 3979 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489419 3979 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489425 3979 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489430 3979 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489436 3979 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489442 3979 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489447 3979 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489452 3979 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489458 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489463 3979 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489468 3979 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489474 3979 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489479 3979 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489485 3979 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: 
W0319 09:17:05.489490 3979 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489495 3979 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489501 3979 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489506 3979 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489512 3979 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:17:05.493010 master-0 kubenswrapper[3979]: W0319 09:17:05.489517 3979 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: W0319 09:17:05.489548 3979 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: W0319 09:17:05.489558 3979 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: W0319 09:17:05.489567 3979 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: W0319 09:17:05.489576 3979 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: W0319 09:17:05.489582 3979 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: W0319 09:17:05.489588 3979 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: I0319 09:17:05.489596 3979 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:17:05.493660 master-0 kubenswrapper[3979]: I0319 09:17:05.490695 3979 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 09:17:05.494013 master-0 kubenswrapper[3979]: I0319 09:17:05.493973 3979 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 19 09:17:05.494938 master-0 kubenswrapper[3979]: I0319 09:17:05.494886 3979 server.go:997] "Starting client certificate rotation"
Mar 19 09:17:05.494938 master-0 kubenswrapper[3979]: I0319 09:17:05.494923 3979 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 09:17:05.495202 master-0 kubenswrapper[3979]: I0319 09:17:05.495123 3979 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:17:05.525293 master-0 kubenswrapper[3979]: I0319 
09:17:05.525209 3979 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:17:05.531044 master-0 kubenswrapper[3979]: I0319 09:17:05.531007 3979 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:17:05.534548 master-0 kubenswrapper[3979]: E0319 09:17:05.534454 3979 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:05.550261 master-0 kubenswrapper[3979]: I0319 09:17:05.550188 3979 log.go:25] "Validated CRI v1 runtime API"
Mar 19 09:17:05.559756 master-0 kubenswrapper[3979]: I0319 09:17:05.559678 3979 log.go:25] "Validated CRI v1 image API"
Mar 19 09:17:05.562488 master-0 kubenswrapper[3979]: I0319 09:17:05.562437 3979 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 09:17:05.566163 master-0 kubenswrapper[3979]: I0319 09:17:05.566062 3979 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 aae93335-158a-444f-870b-34679824b626:/dev/vda3]
Mar 19 09:17:05.566163 master-0 kubenswrapper[3979]: I0319 09:17:05.566144 3979 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Mar 19 09:17:05.595861 master-0 kubenswrapper[3979]: I0319 09:17:05.595241 3979 
manager.go:217] Machine: {Timestamp:2026-03-19 09:17:05.593637026 +0000 UTC m=+0.636624684 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:dab19efcf33543febdac139f3c303589 SystemUUID:dab19efc-f335-43fe-bdac-139f3c303589 BootID:870de220-908c-4452-8349-8f04a86857c3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:5a:31:1f Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:3a:39:e3:03:14:35 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] 
SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 
Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 09:17:05.596090 master-0 kubenswrapper[3979]: I0319 09:17:05.595906 3979 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 09:17:05.596231 master-0 kubenswrapper[3979]: I0319 09:17:05.596171 3979 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 09:17:05.597285 master-0 kubenswrapper[3979]: I0319 09:17:05.597231 3979 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 09:17:05.597726 master-0 kubenswrapper[3979]: I0319 09:17:05.597642 3979 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 09:17:05.598149 master-0 kubenswrapper[3979]: I0319 09:17:05.597727 3979 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 09:17:05.598206 master-0 kubenswrapper[3979]: I0319 09:17:05.598190 3979 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 09:17:05.598243 master-0 kubenswrapper[3979]: I0319 09:17:05.598213 3979 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 09:17:05.598362 master-0 kubenswrapper[3979]: I0319 09:17:05.598329 3979 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 09:17:05.598416 master-0 kubenswrapper[3979]: I0319 09:17:05.598380 3979 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 09:17:05.598651 master-0 kubenswrapper[3979]: I0319 09:17:05.598619 3979 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 09:17:05.598843 master-0 kubenswrapper[3979]: I0319 09:17:05.598780 3979 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 09:17:05.603334 master-0 kubenswrapper[3979]: I0319 09:17:05.603281 3979 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 09:17:05.603690 master-0 kubenswrapper[3979]: I0319 09:17:05.603369 3979 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 09:17:05.603690 master-0 kubenswrapper[3979]: I0319 09:17:05.603414 3979 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 09:17:05.603690 master-0 kubenswrapper[3979]: I0319 09:17:05.603434 3979 kubelet.go:324] "Adding apiserver pod source"
Mar 19 09:17:05.603690 master-0 kubenswrapper[3979]: I0319 09:17:05.603452 3979 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 09:17:05.609017 master-0 kubenswrapper[3979]: I0319 09:17:05.608963 3979 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 19 09:17:05.610998 master-0 kubenswrapper[3979]: W0319 09:17:05.610842 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.611060 master-0 kubenswrapper[3979]: W0319 09:17:05.610950 3979 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.611100 master-0 kubenswrapper[3979]: E0319 09:17:05.611069 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:05.611100 master-0 kubenswrapper[3979]: E0319 09:17:05.611078 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:05.614326 master-0 kubenswrapper[3979]: I0319 09:17:05.614254 3979 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 09:17:05.614924 master-0 kubenswrapper[3979]: I0319 09:17:05.614874 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 09:17:05.615018 master-0 kubenswrapper[3979]: I0319 09:17:05.614937 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 09:17:05.615018 master-0 kubenswrapper[3979]: I0319 09:17:05.614993 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 09:17:05.615018 master-0 kubenswrapper[3979]: I0319 09:17:05.615011 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 09:17:05.615123 master-0 kubenswrapper[3979]: I0319 09:17:05.615028 
3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 09:17:05.615123 master-0 kubenswrapper[3979]: I0319 09:17:05.615058 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 09:17:05.615123 master-0 kubenswrapper[3979]: I0319 09:17:05.615077 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 09:17:05.615123 master-0 kubenswrapper[3979]: I0319 09:17:05.615095 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 09:17:05.615123 master-0 kubenswrapper[3979]: I0319 09:17:05.615119 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 09:17:05.615294 master-0 kubenswrapper[3979]: I0319 09:17:05.615137 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 09:17:05.615294 master-0 kubenswrapper[3979]: I0319 09:17:05.615166 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 09:17:05.615294 master-0 kubenswrapper[3979]: I0319 09:17:05.615196 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 19 09:17:05.616590 master-0 kubenswrapper[3979]: I0319 09:17:05.616517 3979 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 09:17:05.617426 master-0 kubenswrapper[3979]: I0319 09:17:05.617388 3979 server.go:1280] "Started kubelet"
Mar 19 09:17:05.617880 master-0 kubenswrapper[3979]: I0319 09:17:05.617751 3979 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 09:17:05.618432 master-0 kubenswrapper[3979]: I0319 09:17:05.618335 3979 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 09:17:05.618502 master-0 kubenswrapper[3979]: I0319 09:17:05.618453 3979 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 19 09:17:05.618921 master-0 kubenswrapper[3979]: I0319 09:17:05.618883 3979 server.go:236] "Starting 
to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 09:17:05.619473 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 19 09:17:05.621185 master-0 kubenswrapper[3979]: I0319 09:17:05.621115 3979 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 09:17:05.621185 master-0 kubenswrapper[3979]: I0319 09:17:05.621175 3979 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 09:17:05.621513 master-0 kubenswrapper[3979]: I0319 09:17:05.621489 3979 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 09:17:05.621649 master-0 kubenswrapper[3979]: I0319 09:17:05.621629 3979 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 09:17:05.621910 master-0 kubenswrapper[3979]: I0319 09:17:05.621888 3979 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 19 09:17:05.622151 master-0 kubenswrapper[3979]: I0319 09:17:05.622131 3979 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 09:17:05.622279 master-0 kubenswrapper[3979]: I0319 09:17:05.622261 3979 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 09:17:05.622403 master-0 kubenswrapper[3979]: I0319 09:17:05.622374 3979 server.go:449] "Adding debug handlers to kubelet server"
Mar 19 09:17:05.622886 master-0 kubenswrapper[3979]: I0319 09:17:05.622854 3979 factory.go:55] Registering systemd factory
Mar 19 09:17:05.623015 master-0 kubenswrapper[3979]: I0319 09:17:05.622995 3979 factory.go:221] Registration of the systemd container factory successfully
Mar 19 09:17:05.623739 master-0 kubenswrapper[3979]: E0319 09:17:05.623714 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:17:05.624110 master-0 kubenswrapper[3979]: I0319 09:17:05.624088 3979 factory.go:153] Registering CRI-O factory
Mar 19 09:17:05.624256 master-0 kubenswrapper[3979]: I0319 09:17:05.624236 3979 
factory.go:221] Registration of the crio container factory successfully
Mar 19 09:17:05.624427 master-0 kubenswrapper[3979]: I0319 09:17:05.624406 3979 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 09:17:05.624576 master-0 kubenswrapper[3979]: I0319 09:17:05.624558 3979 factory.go:103] Registering Raw factory
Mar 19 09:17:05.624670 master-0 kubenswrapper[3979]: I0319 09:17:05.624658 3979 manager.go:1196] Started watching for new ooms in manager
Mar 19 09:17:05.624974 master-0 kubenswrapper[3979]: E0319 09:17:05.623697 3979 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e336824c93511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.617335569 +0000 UTC m=+0.660323177,LastTimestamp:2026-03-19 09:17:05.617335569 +0000 UTC m=+0.660323177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:05.625463 master-0 kubenswrapper[3979]: I0319 09:17:05.625442 3979 manager.go:319] Starting recovery of all containers
Mar 19 09:17:05.628607 master-0 kubenswrapper[3979]: E0319 09:17:05.624228 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: 
connect: connection refused" interval="200ms"
Mar 19 09:17:05.629437 master-0 kubenswrapper[3979]: I0319 09:17:05.629370 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.630575 master-0 kubenswrapper[3979]: W0319 09:17:05.630463 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.631696 master-0 kubenswrapper[3979]: E0319 09:17:05.631642 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:05.644818 master-0 kubenswrapper[3979]: E0319 09:17:05.644740 3979 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 19 09:17:05.651858 master-0 kubenswrapper[3979]: I0319 09:17:05.651826 3979 manager.go:324] Recovery completed
Mar 19 09:17:05.663831 master-0 kubenswrapper[3979]: I0319 09:17:05.663789 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:05.665867 master-0 kubenswrapper[3979]: I0319 09:17:05.665807 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:05.666094 master-0 kubenswrapper[3979]: I0319 09:17:05.666055 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:05.666313 master-0 kubenswrapper[3979]: I0319 09:17:05.666286 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:05.667769 master-0 kubenswrapper[3979]: I0319 09:17:05.667742 3979 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 09:17:05.667945 master-0 kubenswrapper[3979]: I0319 09:17:05.667921 3979 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 09:17:05.668151 master-0 kubenswrapper[3979]: I0319 09:17:05.668131 3979 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 09:17:05.672789 master-0 kubenswrapper[3979]: I0319 09:17:05.672766 3979 policy_none.go:49] "None policy: Start"
Mar 19 09:17:05.673994 master-0 kubenswrapper[3979]: I0319 09:17:05.673941 3979 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 09:17:05.673994 master-0 kubenswrapper[3979]: I0319 09:17:05.673985 3979 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 09:17:05.724265 master-0 kubenswrapper[3979]: E0319 09:17:05.724202 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:17:05.745027 
master-0 kubenswrapper[3979]: I0319 09:17:05.744746 3979 manager.go:334] "Starting Device Plugin manager"
Mar 19 09:17:05.745027 master-0 kubenswrapper[3979]: I0319 09:17:05.744906 3979 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 09:17:05.745027 master-0 kubenswrapper[3979]: I0319 09:17:05.744937 3979 server.go:79] "Starting device plugin registration server"
Mar 19 09:17:05.745834 master-0 kubenswrapper[3979]: I0319 09:17:05.745782 3979 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 09:17:05.745959 master-0 kubenswrapper[3979]: I0319 09:17:05.745825 3979 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 09:17:05.746113 master-0 kubenswrapper[3979]: I0319 09:17:05.746049 3979 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 09:17:05.746270 master-0 kubenswrapper[3979]: I0319 09:17:05.746219 3979 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 09:17:05.746270 master-0 kubenswrapper[3979]: I0319 09:17:05.746246 3979 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 09:17:05.748304 master-0 kubenswrapper[3979]: E0319 09:17:05.748236 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 09:17:05.778612 master-0 kubenswrapper[3979]: I0319 09:17:05.778514 3979 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: I0319 09:17:05.781728 3979 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: I0319 09:17:05.781787 3979 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: I0319 09:17:05.781819 3979 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: E0319 09:17:05.781971 3979 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: W0319 09:17:05.784007 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: E0319 09:17:05.784057 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: E0319 09:17:05.830238 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: I0319 09:17:05.846298 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: I0319 09:17:05.847995 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:05.868035 master-0 
kubenswrapper[3979]: I0319 09:17:05.848056 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: I0319 09:17:05.848077 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: I0319 09:17:05.848130 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:05.868035 master-0 kubenswrapper[3979]: E0319 09:17:05.849383 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:05.882736 master-0 kubenswrapper[3979]: I0319 09:17:05.882621 3979 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 19 09:17:05.882865 master-0 kubenswrapper[3979]: I0319 09:17:05.882794 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:05.884383 master-0 kubenswrapper[3979]: I0319 09:17:05.884331 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:05.884476 master-0 kubenswrapper[3979]: I0319 09:17:05.884384 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:05.884476 master-0 kubenswrapper[3979]: I0319 09:17:05.884402 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:05.884680 master-0 kubenswrapper[3979]: I0319 
09:17:05.884615 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.884972 master-0 kubenswrapper[3979]: I0319 09:17:05.884914 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:05.885057 master-0 kubenswrapper[3979]: I0319 09:17:05.884981 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.886091 master-0 kubenswrapper[3979]: I0319 09:17:05.886044 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.886091 master-0 kubenswrapper[3979]: I0319 09:17:05.886077 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.886242 master-0 kubenswrapper[3979]: I0319 09:17:05.886141 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.886242 master-0 kubenswrapper[3979]: I0319 09:17:05.886162 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.886242 master-0 kubenswrapper[3979]: I0319 09:17:05.886085 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.886419 master-0 kubenswrapper[3979]: I0319 09:17:05.886262 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.886419 master-0 kubenswrapper[3979]: I0319 09:17:05.886320 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.886571 master-0 kubenswrapper[3979]: I0319 09:17:05.886501 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:05.886571 master-0 kubenswrapper[3979]: I0319 09:17:05.886553 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.887414 master-0 kubenswrapper[3979]: I0319 09:17:05.887368 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.887414 master-0 kubenswrapper[3979]: I0319 09:17:05.887387 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.887414 master-0 kubenswrapper[3979]: I0319 09:17:05.887408 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.887414 master-0 kubenswrapper[3979]: I0319 09:17:05.887395 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.887414 master-0 kubenswrapper[3979]: I0319 09:17:05.887419 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.887798 master-0 kubenswrapper[3979]: I0319 09:17:05.887422 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.887798 master-0 kubenswrapper[3979]: I0319 09:17:05.887633 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.887945 master-0 kubenswrapper[3979]: I0319 09:17:05.887901 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:05.888041 master-0 kubenswrapper[3979]: I0319 09:17:05.888003 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.888339 master-0 kubenswrapper[3979]: I0319 09:17:05.888299 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.888339 master-0 kubenswrapper[3979]: I0319 09:17:05.888333 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.888470 master-0 kubenswrapper[3979]: I0319 09:17:05.888345 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.888470 master-0 kubenswrapper[3979]: I0319 09:17:05.888439 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.888715 master-0 kubenswrapper[3979]: I0319 09:17:05.888660 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:05.888715 master-0 kubenswrapper[3979]: I0319 09:17:05.888716 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.889202 master-0 kubenswrapper[3979]: I0319 09:17:05.889163 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.889202 master-0 kubenswrapper[3979]: I0319 09:17:05.889190 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.889202 master-0 kubenswrapper[3979]: I0319 09:17:05.889200 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.889202 master-0 kubenswrapper[3979]: I0319 09:17:05.889203 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.889434 master-0 kubenswrapper[3979]: I0319 09:17:05.889227 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.889434 master-0 kubenswrapper[3979]: I0319 09:17:05.889238 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.889622 master-0 kubenswrapper[3979]: I0319 09:17:05.889445 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:05.889622 master-0 kubenswrapper[3979]: I0319 09:17:05.889490 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:05.889842 master-0 kubenswrapper[3979]: I0319 09:17:05.889788 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.889842 master-0 kubenswrapper[3979]: I0319 09:17:05.889836 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.889972 master-0 kubenswrapper[3979]: I0319 09:17:05.889858 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.890356 master-0 kubenswrapper[3979]: I0319 09:17:05.890302 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:05.890433 master-0 kubenswrapper[3979]: I0319 09:17:05.890354 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:05.890433 master-0 kubenswrapper[3979]: I0319 09:17:05.890380 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:05.923928 master-0 kubenswrapper[3979]: I0319 09:17:05.923855 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:05.924132 master-0 kubenswrapper[3979]: I0319 09:17:05.923984 3979 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:05.924132 master-0 kubenswrapper[3979]: I0319 09:17:05.924038 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:05.924132 master-0 kubenswrapper[3979]: I0319 09:17:05.924117 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:05.924430 master-0 kubenswrapper[3979]: I0319 09:17:05.924185 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:05.924430 master-0 kubenswrapper[3979]: I0319 09:17:05.924241 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:05.924430 master-0 kubenswrapper[3979]: I0319 09:17:05.924335 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:05.924430 master-0 kubenswrapper[3979]: I0319 09:17:05.924387 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:05.924847 master-0 kubenswrapper[3979]: I0319 09:17:05.924432 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:05.924847 master-0 kubenswrapper[3979]: I0319 09:17:05.924571 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:05.924847 master-0 kubenswrapper[3979]: I0319 09:17:05.924645 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod 
\"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:05.924847 master-0 kubenswrapper[3979]: I0319 09:17:05.924755 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:05.924847 master-0 kubenswrapper[3979]: I0319 09:17:05.924814 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:05.925237 master-0 kubenswrapper[3979]: I0319 09:17:05.924875 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:05.925237 master-0 kubenswrapper[3979]: I0319 09:17:05.924972 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:05.925237 master-0 kubenswrapper[3979]: I0319 09:17:05.925021 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:05.925237 master-0 kubenswrapper[3979]: I0319 09:17:05.925159 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:06.026052 master-0 kubenswrapper[3979]: I0319 09:17:06.025936 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:06.026052 master-0 kubenswrapper[3979]: I0319 09:17:06.026017 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:06.026052 master-0 kubenswrapper[3979]: I0319 09:17:06.026060 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 
09:17:06.026119 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026127 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026118 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026209 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026229 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: 
I0319 09:17:06.026256 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026285 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026313 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026345 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: I0319 09:17:06.026370 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.026408 master-0 kubenswrapper[3979]: 
I0319 09:17:06.026406 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026436 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026437 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026463 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026476 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026500 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026581 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026609 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026631 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026642 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " 
pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026677 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026653 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026718 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026728 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026752 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026630 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026773 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:06.027043 master-0 kubenswrapper[3979]: I0319 09:17:06.026803 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:06.028055 master-0 kubenswrapper[3979]: I0319 09:17:06.026807 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:06.028055 master-0 kubenswrapper[3979]: I0319 09:17:06.026834 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:06.028055 master-0 
kubenswrapper[3979]: I0319 09:17:06.026601 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:06.050213 master-0 kubenswrapper[3979]: I0319 09:17:06.050109 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:06.051458 master-0 kubenswrapper[3979]: I0319 09:17:06.051389 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:06.051458 master-0 kubenswrapper[3979]: I0319 09:17:06.051441 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:06.051713 master-0 kubenswrapper[3979]: I0319 09:17:06.051513 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:06.051713 master-0 kubenswrapper[3979]: I0319 09:17:06.051621 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:06.052625 master-0 kubenswrapper[3979]: E0319 09:17:06.052558 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 09:17:06.228071 master-0 kubenswrapper[3979]: I0319 09:17:06.227946 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:17:06.232007 master-0 kubenswrapper[3979]: E0319 09:17:06.231306 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 19 09:17:06.235429 master-0 kubenswrapper[3979]: I0319 09:17:06.235273 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:17:06.255885 master-0 kubenswrapper[3979]: I0319 09:17:06.255745 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:17:06.279572 master-0 kubenswrapper[3979]: I0319 09:17:06.279425 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:17:06.290621 master-0 kubenswrapper[3979]: I0319 09:17:06.290564 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:17:06.453496 master-0 kubenswrapper[3979]: I0319 09:17:06.453410 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:06.455291 master-0 kubenswrapper[3979]: I0319 09:17:06.455231 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:06.455373 master-0 kubenswrapper[3979]: I0319 09:17:06.455322 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:06.455373 master-0 kubenswrapper[3979]: I0319 09:17:06.455347 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:06.455594 master-0 kubenswrapper[3979]: I0319 09:17:06.455475 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:06.456887 master-0 kubenswrapper[3979]: E0319 09:17:06.456805 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:06.512868 master-0 kubenswrapper[3979]: W0319 09:17:06.512619 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:06.512868 master-0 kubenswrapper[3979]: E0319 09:17:06.512760 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:06.631310 master-0 kubenswrapper[3979]: I0319 09:17:06.631200 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:06.900106 master-0 kubenswrapper[3979]: W0319 09:17:06.899673 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:06.900220 master-0 kubenswrapper[3979]: E0319 09:17:06.900116 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:06.955175 master-0 kubenswrapper[3979]: W0319 09:17:06.955013 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:06.955175 master-0 kubenswrapper[3979]: E0319 09:17:06.955164 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:06.956858 master-0 kubenswrapper[3979]: W0319 09:17:06.956756 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83737980b9ee109184b1d78e942cf36.slice/crio-205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979 WatchSource:0}: Error finding container 205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979: Status 404 returned error can't find the container with id 205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979
Mar 19 09:17:06.966592 master-0 kubenswrapper[3979]: I0319 09:17:06.966543 3979 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:17:07.017667 master-0 kubenswrapper[3979]: W0319 09:17:07.017547 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664a6d0d2a24360dee10612610f1b59.slice/crio-ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c WatchSource:0}: Error finding container ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c: Status 404 returned error can't find the container with id ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c
Mar 19 09:17:07.020452 master-0 kubenswrapper[3979]: W0319 09:17:07.020406 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fac1b46a11e49501805e891baae4a9.slice/crio-4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43 WatchSource:0}: Error finding container 4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43: Status 404 returned error can't find the container with id 4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43
Mar 19 09:17:07.032837 master-0 kubenswrapper[3979]: E0319 09:17:07.032795 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 09:17:07.039311 master-0 kubenswrapper[3979]: W0319 09:17:07.039144 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:07.039311 master-0 kubenswrapper[3979]: E0319 09:17:07.039307 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:07.056447 master-0 kubenswrapper[3979]: W0319 09:17:07.056381 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f265536aba6292ead501bc9b49f327.slice/crio-628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4 WatchSource:0}: Error finding container 628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4: Status 404 returned error can't find the container with id 628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4
Mar 19 09:17:07.257517 master-0 kubenswrapper[3979]: I0319 09:17:07.257197 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:07.258595 master-0 kubenswrapper[3979]: I0319 09:17:07.258563 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:07.258717 master-0 kubenswrapper[3979]: I0319 09:17:07.258618 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:07.258717 master-0 kubenswrapper[3979]: I0319 09:17:07.258631 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:07.258717 master-0 kubenswrapper[3979]: I0319 09:17:07.258716 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:07.259794 master-0 kubenswrapper[3979]: E0319 09:17:07.259721 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:07.631118 master-0 kubenswrapper[3979]: I0319 09:17:07.631051 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:07.688482 master-0 kubenswrapper[3979]: I0319 09:17:07.688405 3979 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:17:07.703904 master-0 kubenswrapper[3979]: E0319 09:17:07.703837 3979 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:07.790148 master-0 kubenswrapper[3979]: I0319 09:17:07.789947 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43"}
Mar 19 09:17:07.791252 master-0 kubenswrapper[3979]: I0319 09:17:07.790902 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c"}
Mar 19 09:17:07.793233 master-0 kubenswrapper[3979]: I0319 09:17:07.792993 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979"}
Mar 19 09:17:07.794139 master-0 kubenswrapper[3979]: I0319 09:17:07.794093 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc"}
Mar 19 09:17:07.795288 master-0 kubenswrapper[3979]: I0319 09:17:07.795253 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4"}
Mar 19 09:17:08.631633 master-0 kubenswrapper[3979]: I0319 09:17:08.631550 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:08.634243 master-0 kubenswrapper[3979]: E0319 09:17:08.634165 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 19 09:17:08.859930 master-0 kubenswrapper[3979]: I0319 09:17:08.859856 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:08.861377 master-0 kubenswrapper[3979]: I0319 09:17:08.861338 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:08.861478 master-0 kubenswrapper[3979]: I0319 09:17:08.861397 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:08.861478 master-0 kubenswrapper[3979]: I0319 09:17:08.861412 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:08.861578 master-0 kubenswrapper[3979]: I0319 09:17:08.861513 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:08.862897 master-0 kubenswrapper[3979]: E0319 09:17:08.862848 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:08.887796 master-0 kubenswrapper[3979]: W0319 09:17:08.887578 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:08.887796 master-0 kubenswrapper[3979]: E0319 09:17:08.887696 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:08.963954 master-0 kubenswrapper[3979]: W0319 09:17:08.963823 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:08.964250 master-0 kubenswrapper[3979]: E0319 09:17:08.963973 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:09.069684 master-0 kubenswrapper[3979]: W0319 09:17:09.069587 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:09.069684 master-0 kubenswrapper[3979]: E0319 09:17:09.069680 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:09.205320 master-0 kubenswrapper[3979]: W0319 09:17:09.205130 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:09.205320 master-0 kubenswrapper[3979]: E0319 09:17:09.205239 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:09.389219 master-0 kubenswrapper[3979]: E0319 09:17:09.389062 3979 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e336824c93511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.617335569 +0000 UTC m=+0.660323177,LastTimestamp:2026-03-19 09:17:05.617335569 +0000 UTC m=+0.660323177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:09.630915 master-0 kubenswrapper[3979]: I0319 09:17:09.630648 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:10.631151 master-0 kubenswrapper[3979]: I0319 09:17:10.631070 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:10.805421 master-0 kubenswrapper[3979]: I0319 09:17:10.805331 3979 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="e5e1897ddbf62a1e1975ee8d4b56ad3a8cd0b0cf3d4e0758eac825b5a75e9b66" exitCode=0
Mar 19 09:17:10.805731 master-0 kubenswrapper[3979]: I0319 09:17:10.805442 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"e5e1897ddbf62a1e1975ee8d4b56ad3a8cd0b0cf3d4e0758eac825b5a75e9b66"}
Mar 19 09:17:10.805731 master-0 kubenswrapper[3979]: I0319 09:17:10.805499 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:10.807018 master-0 kubenswrapper[3979]: I0319 09:17:10.806974 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:10.807066 master-0 kubenswrapper[3979]: I0319 09:17:10.807023 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:10.807066 master-0 kubenswrapper[3979]: I0319 09:17:10.807047 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:10.807406 master-0 kubenswrapper[3979]: I0319 09:17:10.807360 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538"}
Mar 19 09:17:10.807406 master-0 kubenswrapper[3979]: I0319 09:17:10.807404 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98"}
Mar 19 09:17:10.807472 master-0 kubenswrapper[3979]: I0319 09:17:10.807438 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:10.808373 master-0 kubenswrapper[3979]: I0319 09:17:10.808135 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:10.808373 master-0 kubenswrapper[3979]: I0319 09:17:10.808165 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:10.808373 master-0 kubenswrapper[3979]: I0319 09:17:10.808174 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:11.631624 master-0 kubenswrapper[3979]: I0319 09:17:11.631558 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:11.811868 master-0 kubenswrapper[3979]: I0319 09:17:11.811795 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 09:17:11.812272 master-0 kubenswrapper[3979]: I0319 09:17:11.812235 3979 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="75b230851f4c56f1f87e439937d74e032148cf9c0606c0e61fcca1777e6dbf98" exitCode=1
Mar 19 09:17:11.812377 master-0 kubenswrapper[3979]: I0319 09:17:11.812336 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:11.812463 master-0 kubenswrapper[3979]: I0319 09:17:11.812350 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:11.812559 master-0 kubenswrapper[3979]: I0319 09:17:11.812330 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"75b230851f4c56f1f87e439937d74e032148cf9c0606c0e61fcca1777e6dbf98"}
Mar 19 09:17:11.813412 master-0 kubenswrapper[3979]: I0319 09:17:11.813373 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:11.813473 master-0 kubenswrapper[3979]: I0319 09:17:11.813416 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:11.813473 master-0 kubenswrapper[3979]: I0319 09:17:11.813381 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:11.813473 master-0 kubenswrapper[3979]: I0319 09:17:11.813428 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:11.813473 master-0 kubenswrapper[3979]: I0319 09:17:11.813445 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:11.813473 master-0 kubenswrapper[3979]: I0319 09:17:11.813455 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:11.813897 master-0 kubenswrapper[3979]: I0319 09:17:11.813861 3979 scope.go:117] "RemoveContainer" containerID="75b230851f4c56f1f87e439937d74e032148cf9c0606c0e61fcca1777e6dbf98"
Mar 19 09:17:11.835209 master-0 kubenswrapper[3979]: E0319 09:17:11.835135 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 19 09:17:12.038292 master-0 kubenswrapper[3979]: I0319 09:17:12.038234 3979 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:17:12.039891 master-0 kubenswrapper[3979]: E0319 09:17:12.039845 3979 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:12.063503 master-0 kubenswrapper[3979]: I0319 09:17:12.063458 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:12.064369 master-0 kubenswrapper[3979]: I0319 09:17:12.064345 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:12.064429 master-0 kubenswrapper[3979]: I0319 09:17:12.064378 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:12.064429 master-0 kubenswrapper[3979]: I0319 09:17:12.064388 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:12.064496 master-0 kubenswrapper[3979]: I0319 09:17:12.064446 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:12.065360 master-0 kubenswrapper[3979]: E0319 09:17:12.065328 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:12.631471 master-0 kubenswrapper[3979]: I0319 09:17:12.631362 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:12.816927 master-0 kubenswrapper[3979]: I0319 09:17:12.816804 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 09:17:12.817586 master-0 kubenswrapper[3979]: I0319 09:17:12.817138 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"fbf135176e2a5048fc401c44235b11465f4467b7f638a9d3f3d0d58d2b613241"}
Mar 19 09:17:12.817586 master-0 kubenswrapper[3979]: I0319 09:17:12.817266 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:12.818032 master-0 kubenswrapper[3979]: I0319 09:17:12.817990 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:12.818032 master-0 kubenswrapper[3979]: I0319 09:17:12.818023 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:12.818032 master-0 kubenswrapper[3979]: I0319 09:17:12.818035 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:13.084181 master-0 kubenswrapper[3979]: W0319 09:17:13.084048 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:13.084181 master-0 kubenswrapper[3979]: E0319 09:17:13.084115 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:13.144636 master-0 kubenswrapper[3979]: W0319 09:17:13.144547 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:13.144737 master-0 kubenswrapper[3979]: E0319 09:17:13.144659 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:13.631169 master-0 kubenswrapper[3979]: I0319 09:17:13.631066 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:13.759865 master-0 kubenswrapper[3979]: W0319 09:17:13.759760 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:13.759865 master-0 kubenswrapper[3979]: E0319 09:17:13.759835 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:13.819038 master-0 kubenswrapper[3979]: I0319 09:17:13.818966 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:13.819950 master-0 kubenswrapper[3979]: I0319 09:17:13.819897 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:13.819950 master-0 kubenswrapper[3979]: I0319 09:17:13.819948 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:13.820072 master-0 kubenswrapper[3979]: I0319 09:17:13.819957 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:14.631819 master-0 kubenswrapper[3979]: I0319 09:17:14.631736 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:14.806940 master-0 kubenswrapper[3979]: W0319 09:17:14.806833 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:14.806940 master-0 kubenswrapper[3979]: E0319 09:17:14.806899 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:15.631565 master-0 kubenswrapper[3979]: I0319 09:17:15.631456 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:15.748901 master-0 kubenswrapper[3979]: E0319 09:17:15.748808 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 09:17:16.631222 master-0 kubenswrapper[3979]: I0319 09:17:16.631064 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:16.828051 master-0 kubenswrapper[3979]: I0319 09:17:16.827954 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 19 09:17:16.829748 master-0 kubenswrapper[3979]: I0319 09:17:16.829703 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 09:17:16.830295 master-0 kubenswrapper[3979]: I0319 09:17:16.830244 3979 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="fbf135176e2a5048fc401c44235b11465f4467b7f638a9d3f3d0d58d2b613241" exitCode=1
Mar 19 09:17:16.830361 master-0 kubenswrapper[3979]: I0319 09:17:16.830296 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"fbf135176e2a5048fc401c44235b11465f4467b7f638a9d3f3d0d58d2b613241"}
Mar 19 09:17:16.830408 master-0 kubenswrapper[3979]: I0319 09:17:16.830370 3979 scope.go:117] "RemoveContainer" containerID="75b230851f4c56f1f87e439937d74e032148cf9c0606c0e61fcca1777e6dbf98"
Mar 19 09:17:16.830549 master-0 kubenswrapper[3979]: I0319 09:17:16.830511 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:16.831293 master-0 kubenswrapper[3979]: I0319 09:17:16.831251 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:16.831293 master-0 kubenswrapper[3979]: I0319 09:17:16.831292 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:16.831386 master-0 kubenswrapper[3979]: I0319 09:17:16.831305 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:16.831705 master-0 kubenswrapper[3979]: I0319 09:17:16.831670 3979 scope.go:117] "RemoveContainer" containerID="fbf135176e2a5048fc401c44235b11465f4467b7f638a9d3f3d0d58d2b613241"
Mar 19 09:17:16.831893 master-0 kubenswrapper[3979]: E0319 09:17:16.831849 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 09:17:17.632930 master-0 kubenswrapper[3979]: I0319 09:17:17.632844 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:17.835000 master-0 kubenswrapper[3979]: I0319 09:17:17.834275 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:17.835000 master-0 kubenswrapper[3979]: I0319 09:17:17.834299 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"9a59b0cbe8ea8fa4b17a290e74267cd3c1f43f118142de7e624d510bbb389da7"}
Mar 19 09:17:17.836273 master-0 kubenswrapper[3979]: I0319 09:17:17.835716 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:17.836273 master-0 kubenswrapper[3979]: I0319 09:17:17.835764 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:17.836273 master-0 kubenswrapper[3979]: I0319 09:17:17.835777 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:17.836414 master-0 kubenswrapper[3979]: I0319 09:17:17.836308 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 19 09:17:17.838660 master-0 kubenswrapper[3979]: I0319 09:17:17.838609 3979 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="382712d4a8a720b54161d083c15e892932ef38c413a22bb647480e2f84ff33a9" exitCode=1
Mar 19 09:17:17.838723 master-0 kubenswrapper[3979]: I0319 09:17:17.838667 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"382712d4a8a720b54161d083c15e892932ef38c413a22bb647480e2f84ff33a9"} Mar 19 09:17:17.840415 master-0 kubenswrapper[3979]: I0319 09:17:17.840382 3979 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8" exitCode=0 Mar 19 09:17:17.840489 master-0 kubenswrapper[3979]: I0319 09:17:17.840417 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8"} Mar 19 09:17:17.840608 master-0 kubenswrapper[3979]: I0319 09:17:17.840576 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:17.841663 master-0 kubenswrapper[3979]: I0319 09:17:17.841618 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:17.841723 master-0 kubenswrapper[3979]: I0319 09:17:17.841695 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:17.841765 master-0 kubenswrapper[3979]: I0319 09:17:17.841721 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:17.845180 master-0 kubenswrapper[3979]: I0319 09:17:17.844881 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:17.845513 master-0 kubenswrapper[3979]: I0319 09:17:17.845483 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:17.845585 master-0 kubenswrapper[3979]: I0319 09:17:17.845560 
3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:17.845585 master-0 kubenswrapper[3979]: I0319 09:17:17.845581 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:18.237069 master-0 kubenswrapper[3979]: E0319 09:17:18.236781 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 19 09:17:18.465816 master-0 kubenswrapper[3979]: I0319 09:17:18.465751 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:18.467032 master-0 kubenswrapper[3979]: I0319 09:17:18.466973 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:18.467032 master-0 kubenswrapper[3979]: I0319 09:17:18.467011 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:18.467032 master-0 kubenswrapper[3979]: I0319 09:17:18.467023 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:18.467265 master-0 kubenswrapper[3979]: I0319 09:17:18.467088 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:18.467959 master-0 kubenswrapper[3979]: E0319 09:17:18.467930 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 09:17:18.845771 master-0 kubenswrapper[3979]: I0319 09:17:18.845680 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34"} Mar 19 09:17:18.845771 master-0 kubenswrapper[3979]: I0319 09:17:18.845725 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:18.846729 master-0 kubenswrapper[3979]: I0319 09:17:18.846683 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:18.846729 master-0 kubenswrapper[3979]: I0319 09:17:18.846730 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:18.846873 master-0 kubenswrapper[3979]: I0319 09:17:18.846746 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:19.996734 master-0 kubenswrapper[3979]: I0319 09:17:19.996574 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:19.996734 master-0 kubenswrapper[3979]: E0319 09:17:19.996594 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336824c93511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.617335569 +0000 UTC m=+0.660323177,LastTimestamp:2026-03-19 09:17:05.617335569 +0000 
UTC m=+0.660323177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.007936 master-0 kubenswrapper[3979]: E0319 09:17:20.007154 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.015518 master-0 kubenswrapper[3979]: E0319 09:17:20.015359 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.021558 master-0 kubenswrapper[3979]: E0319 09:17:20.021193 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b6c030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,LastTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.028037 master-0 kubenswrapper[3979]: E0319 09:17:20.027794 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33682c96ab67 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.748241255 +0000 UTC m=+0.791228863,LastTimestamp:2026-03-19 09:17:05.748241255 +0000 UTC m=+0.791228863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.034067 master-0 kubenswrapper[3979]: E0319 09:17:20.033914 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827affbd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.848027282 +0000 UTC m=+0.891014900,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.038788 master-0 kubenswrapper[3979]: E0319 09:17:20.038651 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b3948f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.848068346 +0000 UTC m=+0.891055964,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.042855 master-0 kubenswrapper[3979]: E0319 09:17:20.042780 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b6c030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b6c030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,LastTimestamp:2026-03-19 09:17:05.848087637 +0000 UTC m=+0.891075255,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.047665 master-0 kubenswrapper[3979]: E0319 09:17:20.047310 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827affbd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.884370097 +0000 UTC m=+0.927357715,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.057507 master-0 kubenswrapper[3979]: E0319 09:17:20.056859 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b3948f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.88439562 +0000 UTC m=+0.927383238,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.061891 master-0 kubenswrapper[3979]: E0319 09:17:20.061448 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b6c030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b6c030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,LastTimestamp:2026-03-19 09:17:05.884412291 +0000 UTC m=+0.927399899,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.066267 master-0 kubenswrapper[3979]: E0319 09:17:20.066122 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827affbd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.886071639 +0000 UTC m=+0.929059217,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.071204 master-0 kubenswrapper[3979]: E0319 09:17:20.071042 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827affbd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.886125833 +0000 UTC m=+0.929113441,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.080786 master-0 kubenswrapper[3979]: E0319 09:17:20.080629 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b3948f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.886154866 +0000 UTC m=+0.929142484,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.091710 master-0 kubenswrapper[3979]: E0319 09:17:20.087579 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b6c030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b6c030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,LastTimestamp:2026-03-19 09:17:05.886174117 +0000 UTC m=+0.929161725,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.104068 master-0 kubenswrapper[3979]: E0319 09:17:20.103885 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b3948f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.886230642 +0000 UTC m=+0.929218230,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.111127 master-0 kubenswrapper[3979]: E0319 09:17:20.111025 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b6c030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b6c030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,LastTimestamp:2026-03-19 09:17:05.886279696 +0000 UTC m=+0.929267284,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.115985 master-0 kubenswrapper[3979]: E0319 09:17:20.115845 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827affbd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.887387138 +0000 UTC m=+0.930374716,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.121125 master-0 kubenswrapper[3979]: E0319 09:17:20.120984 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827affbd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.88740147 +0000 UTC m=+0.930389048,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.125450 master-0 kubenswrapper[3979]: E0319 09:17:20.125383 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b3948f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.887415211 +0000 UTC m=+0.930402789,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.130267 master-0 kubenswrapper[3979]: E0319 09:17:20.130083 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b3948f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.887418661 +0000 UTC m=+0.930406239,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.135174 master-0 kubenswrapper[3979]: E0319 09:17:20.135079 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b6c030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b6c030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,LastTimestamp:2026-03-19 09:17:05.887425882 +0000 UTC m=+0.930413460,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.139459 master-0 kubenswrapper[3979]: E0319 09:17:20.139316 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b6c030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b6c030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666457648 +0000 UTC m=+0.709445256,LastTimestamp:2026-03-19 09:17:05.887492427 +0000 UTC m=+0.930480005,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.143597 master-0 kubenswrapper[3979]: E0319 09:17:20.143494 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827affbd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827affbd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666014161 +0000 UTC m=+0.709001769,LastTimestamp:2026-03-19 09:17:05.888319126 +0000 UTC m=+0.931306704,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.157563 master-0 kubenswrapper[3979]: E0319 09:17:20.156790 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336827b3948f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336827b3948f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.666249871 +0000 UTC m=+0.709237469,LastTimestamp:2026-03-19 09:17:05.888340158 +0000 UTC m=+0.931327736,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.169131 master-0 kubenswrapper[3979]: E0319 09:17:20.167305 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e33687532a2b8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:06.9664222 +0000 UTC m=+2.009409778,LastTimestamp:2026-03-19 09:17:06.9664222 +0000 UTC m=+2.009409778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.194560 master-0 kubenswrapper[3979]: E0319 09:17:20.188055 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-master-0-master-0.189e33687863d633 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:07.019978291 +0000 UTC m=+2.062965879,LastTimestamp:2026-03-19 09:17:07.019978291 +0000 UTC m=+2.062965879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.196653 master-0 kubenswrapper[3979]: E0319 09:17:20.196477 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e33687891b764 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:07.02298506 +0000 UTC m=+2.065972638,LastTimestamp:2026-03-19 09:17:07.02298506 +0000 UTC m=+2.065972638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" 
Mar 19 09:17:20.204628 master-0 kubenswrapper[3979]: E0319 09:17:20.204463 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e33687abe3fab kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:07.059457963 +0000 UTC m=+2.102445541,LastTimestamp:2026-03-19 09:17:07.059457963 +0000 UTC m=+2.102445541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.214266 master-0 kubenswrapper[3979]: E0319 09:17:20.214094 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33687dcf6a23 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:07.110914595 +0000 UTC m=+2.153902193,LastTimestamp:2026-03-19 09:17:07.110914595 +0000 UTC m=+2.153902193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.220197 master-0 kubenswrapper[3979]: E0319 09:17:20.219996 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3369118cf7a0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" in 2.478s (2.478s including waiting). 
Image size: 465090934 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:09.589587872 +0000 UTC m=+4.632575450,LastTimestamp:2026-03-19 09:17:09.589587872 +0000 UTC m=+4.632575450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.225507 master-0 kubenswrapper[3979]: E0319 09:17:20.225381 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3369127d5a74 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" in 2.585s (2.585s including waiting). 
Image size: 529326739 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:09.605341812 +0000 UTC m=+4.648329390,LastTimestamp:2026-03-19 09:17:09.605341812 +0000 UTC m=+4.648329390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.231691 master-0 kubenswrapper[3979]: E0319 09:17:20.231376 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3369212d6846 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:09.85176071 +0000 UTC m=+4.894748298,LastTimestamp:2026-03-19 09:17:09.85176071 +0000 UTC m=+4.894748298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.237134 master-0 kubenswrapper[3979]: E0319 09:17:20.236974 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e336921380b93 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:09.852457875 +0000 UTC m=+4.895445463,LastTimestamp:2026-03-19 09:17:09.852457875 +0000 UTC m=+4.895445463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.242209 master-0 kubenswrapper[3979]: E0319 09:17:20.242095 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336922a800c5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:09.876572357 +0000 UTC m=+4.919559935,LastTimestamp:2026-03-19 09:17:09.876572357 +0000 UTC m=+4.919559935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.247069 master-0 kubenswrapper[3979]: E0319 09:17:20.246835 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3369236ea64d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:09.889590861 +0000 UTC m=+4.932578439,LastTimestamp:2026-03-19 09:17:09.889590861 +0000 UTC m=+4.932578439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.251608 master-0 kubenswrapper[3979]: E0319 09:17:20.251489 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e336923de7a40 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:09.896919616 +0000 UTC m=+4.939907194,LastTimestamp:2026-03-19 09:17:09.896919616 +0000 UTC m=+4.939907194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.256466 master-0 kubenswrapper[3979]: E0319 09:17:20.256279 3979 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3369387e203b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:10.242926651 +0000 UTC m=+5.285914229,LastTimestamp:2026-03-19 09:17:10.242926651 +0000 UTC m=+5.285914229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.261282 master-0 kubenswrapper[3979]: E0319 09:17:20.261144 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e33693f3337e8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:10.355458024 +0000 UTC m=+5.398445622,LastTimestamp:2026-03-19 09:17:10.355458024 +0000 UTC m=+5.398445622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.268646 master-0 kubenswrapper[3979]: E0319 09:17:20.268478 3979 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33695a447027 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:10.809571367 +0000 UTC m=+5.852558945,LastTimestamp:2026-03-19 09:17:10.809571367 +0000 UTC m=+5.852558945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.273803 master-0 kubenswrapper[3979]: E0319 09:17:20.273679 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336968e37e9a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:11.054876314 +0000 UTC m=+6.097863892,LastTimestamp:2026-03-19 
09:17:11.054876314 +0000 UTC m=+6.097863892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.278780 master-0 kubenswrapper[3979]: E0319 09:17:20.278671 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33696a86ab60 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:11.08234736 +0000 UTC m=+6.125334938,LastTimestamp:2026-03-19 09:17:11.08234736 +0000 UTC m=+6.125334938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.283738 master-0 kubenswrapper[3979]: E0319 09:17:20.283634 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33695a447027\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33695a447027 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:10.809571367 +0000 UTC m=+5.852558945,LastTimestamp:2026-03-19 09:17:11.81586191 +0000 UTC m=+6.858849488,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.288428 master-0 kubenswrapper[3979]: E0319 09:17:20.288276 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e336968e37e9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336968e37e9a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:11.054876314 +0000 UTC m=+6.097863892,LastTimestamp:2026-03-19 09:17:12.522198417 +0000 UTC m=+7.565186025,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.292299 master-0 kubenswrapper[3979]: E0319 09:17:20.292094 3979 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33696a86ab60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33696a86ab60 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:11.08234736 +0000 UTC m=+6.125334938,LastTimestamp:2026-03-19 09:17:12.626894819 +0000 UTC m=+7.669882397,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.311745 master-0 kubenswrapper[3979]: E0319 09:17:20.311349 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336abb19942e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 9.706s (9.706s including waiting). 
Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.72911979 +0000 UTC m=+11.772107368,LastTimestamp:2026-03-19 09:17:16.72911979 +0000 UTC m=+11.772107368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.323591 master-0 kubenswrapper[3979]: E0319 09:17:20.323423 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336abdb831ae kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 9.713s (9.713s including waiting). 
Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.77306923 +0000 UTC m=+11.816056808,LastTimestamp:2026-03-19 09:17:16.77306923 +0000 UTC m=+11.816056808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.330269 master-0 kubenswrapper[3979]: E0319 09:17:20.330190 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336ac13897d9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.831815641 +0000 UTC m=+11.874803219,LastTimestamp:2026-03-19 09:17:16.831815641 +0000 UTC m=+11.874803219,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.334674 master-0 kubenswrapper[3979]: E0319 09:17:20.334541 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e336ac4bbcdd6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 9.924s (9.924s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.890746326 +0000 UTC m=+11.933733894,LastTimestamp:2026-03-19 09:17:16.890746326 +0000 UTC m=+11.933733894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.338697 master-0 kubenswrapper[3979]: E0319 09:17:20.338621 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336ac730df3f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.931972927 +0000 UTC m=+11.974960505,LastTimestamp:2026-03-19 09:17:16.931972927 +0000 UTC m=+11.974960505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.344122 master-0 kubenswrapper[3979]: E0319 09:17:20.343954 3979 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336ac7c37a22 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.941580834 +0000 UTC m=+11.984568402,LastTimestamp:2026-03-19 09:17:16.941580834 +0000 UTC m=+11.984568402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.348599 master-0 kubenswrapper[3979]: E0319 09:17:20.348477 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336ac8d96b0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.959795983 +0000 UTC m=+12.002783571,LastTimestamp:2026-03-19 09:17:16.959795983 +0000 UTC m=+12.002783571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 
09:17:20.353215 master-0 kubenswrapper[3979]: E0319 09:17:20.353056 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336ac8fb9b50 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.96203656 +0000 UTC m=+12.005024138,LastTimestamp:2026-03-19 09:17:16.96203656 +0000 UTC m=+12.005024138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.358787 master-0 kubenswrapper[3979]: E0319 09:17:20.358649 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336ac91430fd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.963647741 +0000 UTC 
m=+12.006635319,LastTimestamp:2026-03-19 09:17:16.963647741 +0000 UTC m=+12.006635319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.363294 master-0 kubenswrapper[3979]: E0319 09:17:20.363221 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e336ad0e3cfba kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:17.094694842 +0000 UTC m=+12.137682420,LastTimestamp:2026-03-19 09:17:17.094694842 +0000 UTC m=+12.137682420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.367509 master-0 kubenswrapper[3979]: E0319 09:17:20.367361 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e336ad1eb74ca kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container 
kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:17.111973066 +0000 UTC m=+12.154960644,LastTimestamp:2026-03-19 09:17:17.111973066 +0000 UTC m=+12.154960644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.373129 master-0 kubenswrapper[3979]: E0319 09:17:20.372987 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336afd99a036 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:17.844807734 +0000 UTC m=+12.887795312,LastTimestamp:2026-03-19 09:17:17.844807734 +0000 UTC m=+12.887795312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.379590 master-0 kubenswrapper[3979]: E0319 09:17:20.379430 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336b1edab760 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:18.402721632 +0000 UTC m=+13.445709250,LastTimestamp:2026-03-19 09:17:18.402721632 +0000 UTC m=+13.445709250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.385093 master-0 kubenswrapper[3979]: E0319 09:17:20.384964 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336b2bd51780 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:18.620456832 +0000 UTC m=+13.663444410,LastTimestamp:2026-03-19 09:17:18.620456832 +0000 UTC m=+13.663444410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.389866 master-0 kubenswrapper[3979]: E0319 09:17:20.389724 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336b2be90d2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:18.621764906 +0000 UTC m=+13.664752494,LastTimestamp:2026-03-19 09:17:18.621764906 +0000 UTC m=+13.664752494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.395284 master-0 kubenswrapper[3979]: E0319 09:17:20.395182 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336b6cf80097 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\" in 2.749s (2.749s including waiting). 
Image size: 505246690 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.713263767 +0000 UTC m=+14.756251335,LastTimestamp:2026-03-19 09:17:19.713263767 +0000 UTC m=+14.756251335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.400682 master-0 kubenswrapper[3979]: E0319 09:17:20.400601 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336b7a72196b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.939369323 +0000 UTC m=+14.982356911,LastTimestamp:2026-03-19 09:17:19.939369323 +0000 UTC m=+14.982356911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.406724 master-0 kubenswrapper[3979]: E0319 09:17:20.406659 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336b7bc25b47 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.961406279 +0000 UTC m=+15.004393857,LastTimestamp:2026-03-19 09:17:19.961406279 +0000 UTC m=+15.004393857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.502879 master-0 kubenswrapper[3979]: I0319 09:17:20.501283 3979 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 09:17:20.521146 master-0 kubenswrapper[3979]: I0319 09:17:20.520981 3979 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 09:17:20.635560 master-0 kubenswrapper[3979]: I0319 09:17:20.634867 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:20.852568 master-0 kubenswrapper[3979]: I0319 09:17:20.852060 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5"} Mar 19 09:17:20.852568 master-0 kubenswrapper[3979]: I0319 09:17:20.852190 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:20.856486 master-0 kubenswrapper[3979]: I0319 09:17:20.854364 3979 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:20.856486 master-0 kubenswrapper[3979]: I0319 09:17:20.854411 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:20.856486 master-0 kubenswrapper[3979]: I0319 09:17:20.854425 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:20.856486 master-0 kubenswrapper[3979]: I0319 09:17:20.854909 3979 scope.go:117] "RemoveContainer" containerID="382712d4a8a720b54161d083c15e892932ef38c413a22bb647480e2f84ff33a9" Mar 19 09:17:20.863864 master-0 kubenswrapper[3979]: E0319 09:17:20.863761 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336bb139c189 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:20.858423689 +0000 UTC m=+15.901411257,LastTimestamp:2026-03-19 09:17:20.858423689 +0000 UTC m=+15.901411257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:21.053546 master-0 kubenswrapper[3979]: W0319 09:17:21.053472 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is 
forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 09:17:21.063804 master-0 kubenswrapper[3979]: E0319 09:17:21.053559 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 09:17:21.088117 master-0 kubenswrapper[3979]: E0319 09:17:21.087858 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189e336ac7c37a22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336ac7c37a22 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.941580834 +0000 UTC m=+11.984568402,LastTimestamp:2026-03-19 09:17:21.082057641 +0000 UTC m=+16.125045219,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:21.101117 master-0 kubenswrapper[3979]: E0319 09:17:21.100981 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189e336ac8fb9b50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336ac8fb9b50 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.96203656 +0000 UTC m=+12.005024138,LastTimestamp:2026-03-19 09:17:21.09603723 +0000 UTC m=+16.139024808,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:21.101735 master-0 kubenswrapper[3979]: I0319 09:17:21.101694 3979 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:21.101735 master-0 kubenswrapper[3979]: I0319 09:17:21.101736 3979 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:21.634852 master-0 kubenswrapper[3979]: I0319 09:17:21.634798 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:21.747633 master-0 kubenswrapper[3979]: E0319 09:17:21.747429 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336be5e0d886 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" in 3.119s (3.119s including waiting). Image size: 514984269 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:21.741789318 +0000 UTC m=+16.784776916,LastTimestamp:2026-03-19 09:17:21.741789318 +0000 UTC m=+16.784776916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:21.857995 master-0 kubenswrapper[3979]: I0319 09:17:21.857935 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"a74f0437d5a92c82edd9e58f193503c363594aaca67bff5a5ae6fcd1a5a28477"} Mar 19 09:17:21.858281 master-0 kubenswrapper[3979]: I0319 09:17:21.858068 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:21.858774 master-0 kubenswrapper[3979]: I0319 09:17:21.858739 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:21.858774 master-0 kubenswrapper[3979]: I0319 09:17:21.858775 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:21.858856 master-0 kubenswrapper[3979]: I0319 09:17:21.858786 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:21.913703 master-0 kubenswrapper[3979]: I0319 09:17:21.913199 3979 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:21.921299 master-0 kubenswrapper[3979]: I0319 09:17:21.921229 3979 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:22.055811 master-0 kubenswrapper[3979]: E0319 09:17:22.055618 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336bf8218730 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:22.048018224 +0000 UTC m=+17.091005812,LastTimestamp:2026-03-19 09:17:22.048018224 +0000 UTC m=+17.091005812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:22.074413 master-0 kubenswrapper[3979]: E0319 09:17:22.074213 3979 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336bf94ff3cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:22.067837903 +0000 UTC m=+17.110825481,LastTimestamp:2026-03-19 09:17:22.067837903 +0000 UTC m=+17.110825481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:22.637227 master-0 kubenswrapper[3979]: I0319 09:17:22.637138 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:22.862351 master-0 kubenswrapper[3979]: I0319 09:17:22.862265 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:22.862795 master-0 kubenswrapper[3979]: I0319 09:17:22.862754 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:22.863022 master-0 kubenswrapper[3979]: I0319 09:17:22.862971 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0"}
Mar 19 09:17:22.863022 master-0 kubenswrapper[3979]: I0319 09:17:22.863017 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:17:22.863416 master-0 kubenswrapper[3979]: I0319 09:17:22.863371 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:22.863416 master-0 kubenswrapper[3979]: I0319 09:17:22.863400 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:22.863416 master-0 kubenswrapper[3979]: I0319 09:17:22.863411 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:22.864227 master-0 kubenswrapper[3979]: I0319 09:17:22.864164 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:22.864227 master-0 kubenswrapper[3979]: I0319 09:17:22.864190 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:22.864227 master-0 kubenswrapper[3979]: I0319 09:17:22.864201 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:23.639059 master-0 kubenswrapper[3979]: I0319 09:17:23.638983 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:23.864640 master-0 kubenswrapper[3979]: I0319 09:17:23.864554 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:23.864640 master-0 kubenswrapper[3979]: I0319 09:17:23.864638 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:23.865649 master-0 kubenswrapper[3979]: I0319 09:17:23.865581 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:23.865737 master-0 kubenswrapper[3979]: I0319 09:17:23.865659 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:23.865737 master-0 kubenswrapper[3979]: I0319 09:17:23.865681 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:23.865836 master-0 kubenswrapper[3979]: I0319 09:17:23.865809 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:23.865889 master-0 kubenswrapper[3979]: I0319 09:17:23.865852 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:23.865889 master-0 kubenswrapper[3979]: I0319 09:17:23.865874 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:23.958565 master-0 kubenswrapper[3979]: W0319 09:17:23.958301 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 19 09:17:23.958565 master-0 kubenswrapper[3979]: E0319 09:17:23.958370 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 09:17:24.085518 master-0 kubenswrapper[3979]: I0319 09:17:24.085397 3979 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:17:24.093003 master-0 kubenswrapper[3979]: I0319 09:17:24.092937 3979 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar
19 09:17:24.218955 master-0 kubenswrapper[3979]: W0319 09:17:24.218797 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:24.218955 master-0 kubenswrapper[3979]: E0319 09:17:24.218889 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 09:17:24.635242 master-0 kubenswrapper[3979]: I0319 09:17:24.635158 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:24.867056 master-0 kubenswrapper[3979]: I0319 09:17:24.866969 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:24.867643 master-0 kubenswrapper[3979]: I0319 09:17:24.867205 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:17:24.868237 master-0 kubenswrapper[3979]: I0319 09:17:24.868197 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:24.868311 master-0 kubenswrapper[3979]: I0319 09:17:24.868249 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:24.868311 master-0 kubenswrapper[3979]: I0319 09:17:24.868271 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:24.872939 master-0 kubenswrapper[3979]: I0319 09:17:24.872836 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:17:25.107566 master-0 kubenswrapper[3979]: I0319 09:17:25.107463 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:17:25.107840 master-0 kubenswrapper[3979]: I0319 09:17:25.107737 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:25.109330 master-0 kubenswrapper[3979]: I0319 09:17:25.109242 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:25.109330 master-0 kubenswrapper[3979]: I0319 09:17:25.109316 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:25.109592 master-0 kubenswrapper[3979]: I0319 09:17:25.109347 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:25.245706 master-0 kubenswrapper[3979]: E0319 09:17:25.245625 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 09:17:25.381705 master-0 kubenswrapper[3979]: W0319 09:17:25.381466 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 19 09:17:25.381705 master-0 kubenswrapper[3979]: E0319 09:17:25.381604 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list
*v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 19 09:17:25.468953 master-0 kubenswrapper[3979]: I0319 09:17:25.468826 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:25.470322 master-0 kubenswrapper[3979]: I0319 09:17:25.470257 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:25.470322 master-0 kubenswrapper[3979]: I0319 09:17:25.470305 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:25.470322 master-0 kubenswrapper[3979]: I0319 09:17:25.470317 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:25.470686 master-0 kubenswrapper[3979]: I0319 09:17:25.470384 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:25.477135 master-0 kubenswrapper[3979]: E0319 09:17:25.477078 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 19 09:17:25.638441 master-0 kubenswrapper[3979]: I0319 09:17:25.638207 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:25.750172 master-0 kubenswrapper[3979]: E0319 09:17:25.750027 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 09:17:25.869719 master-0 kubenswrapper[3979]: I0319 09:17:25.869490 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:25.871039 master-0 kubenswrapper[3979]: I0319 09:17:25.870768 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:25.871039 master-0 kubenswrapper[3979]: I0319 09:17:25.870833 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:25.871039 master-0 kubenswrapper[3979]: I0319 09:17:25.870846 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:26.646718 master-0 kubenswrapper[3979]: I0319 09:17:26.646639 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:26.871490 master-0 kubenswrapper[3979]: I0319 09:17:26.871418 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:26.872400 master-0 kubenswrapper[3979]: I0319 09:17:26.872356 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:26.872454 master-0 kubenswrapper[3979]: I0319 09:17:26.872403 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:26.872454 master-0 kubenswrapper[3979]: I0319 09:17:26.872421 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:27.637327 master-0 kubenswrapper[3979]: I0319 09:17:27.637205 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:28.638222 master-0 kubenswrapper[3979]: I0319 09:17:28.638120 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:29.639024 master-0 kubenswrapper[3979]: I0319 09:17:29.638769 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:29.782562 master-0 kubenswrapper[3979]: I0319 09:17:29.782468 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:29.784003 master-0 kubenswrapper[3979]: I0319 09:17:29.783954 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:29.784003 master-0 kubenswrapper[3979]: I0319 09:17:29.784001 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:29.784116 master-0 kubenswrapper[3979]: I0319 09:17:29.784015 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:29.784409 master-0 kubenswrapper[3979]: I0319 09:17:29.784378 3979 scope.go:117] "RemoveContainer" containerID="fbf135176e2a5048fc401c44235b11465f4467b7f638a9d3f3d0d58d2b613241"
Mar 19 09:17:29.794756 master-0 kubenswrapper[3979]: E0319 09:17:29.794588 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33695a447027\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\""
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33695a447027 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:10.809571367 +0000 UTC m=+5.852558945,LastTimestamp:2026-03-19 09:17:29.787122569 +0000 UTC m=+24.830110147,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:30.041515 master-0 kubenswrapper[3979]: E0319 09:17:30.039424 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e336968e37e9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336968e37e9a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:11.054876314 +0000 UTC m=+6.097863892,LastTimestamp:2026-03-19 09:17:30.031815582 +0000 UTC m=+25.074803170,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:30.052610 master-0 kubenswrapper[3979]: E0319 09:17:30.052458 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33696a86ab60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33696a86ab60 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:11.08234736 +0000 UTC m=+6.125334938,LastTimestamp:2026-03-19 09:17:30.045905984 +0000 UTC m=+25.088893572,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:30.062645 master-0 kubenswrapper[3979]: I0319 09:17:30.062574 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:30.062767 master-0 kubenswrapper[3979]: I0319 09:17:30.062721 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:30.064179 master-0 kubenswrapper[3979]: I0319 09:17:30.064106 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:30.064179 master-0 kubenswrapper[3979]: I0319 09:17:30.064158 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 19 09:17:30.064179 master-0 kubenswrapper[3979]: I0319 09:17:30.064178 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:30.639170 master-0 kubenswrapper[3979]: I0319 09:17:30.639071 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:30.740268 master-0 kubenswrapper[3979]: I0319 09:17:30.740101 3979 csr.go:261] certificate signing request csr-zts2h is approved, waiting to be issued Mar 19 09:17:30.887495 master-0 kubenswrapper[3979]: I0319 09:17:30.887433 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:17:30.888586 master-0 kubenswrapper[3979]: I0319 09:17:30.888392 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log" Mar 19 09:17:30.889299 master-0 kubenswrapper[3979]: I0319 09:17:30.889202 3979 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b" exitCode=1 Mar 19 09:17:30.889299 master-0 kubenswrapper[3979]: I0319 09:17:30.889262 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b"} Mar 19 09:17:30.889439 master-0 kubenswrapper[3979]: I0319 09:17:30.889319 3979 scope.go:117] "RemoveContainer" 
containerID="fbf135176e2a5048fc401c44235b11465f4467b7f638a9d3f3d0d58d2b613241" Mar 19 09:17:30.889619 master-0 kubenswrapper[3979]: I0319 09:17:30.889560 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:30.892049 master-0 kubenswrapper[3979]: I0319 09:17:30.891587 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:30.892049 master-0 kubenswrapper[3979]: I0319 09:17:30.891642 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:30.892049 master-0 kubenswrapper[3979]: I0319 09:17:30.891662 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:30.892324 master-0 kubenswrapper[3979]: I0319 09:17:30.892238 3979 scope.go:117] "RemoveContainer" containerID="ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b" Mar 19 09:17:30.892600 master-0 kubenswrapper[3979]: E0319 09:17:30.892554 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 09:17:30.903777 master-0 kubenswrapper[3979]: E0319 09:17:30.903490 3979 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e336ac13897d9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336ac13897d9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:16.831815641 +0000 UTC m=+11.874803219,LastTimestamp:2026-03-19 09:17:30.892461017 +0000 UTC m=+25.935448635,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:31.102515 master-0 kubenswrapper[3979]: I0319 09:17:31.102337 3979 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:31.102968 master-0 kubenswrapper[3979]: I0319 09:17:31.102614 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:31.105040 master-0 kubenswrapper[3979]: I0319 09:17:31.104876 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:31.105040 master-0 kubenswrapper[3979]: I0319 09:17:31.104936 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:31.105040 master-0 kubenswrapper[3979]: I0319 09:17:31.104957 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:31.109560 master-0 kubenswrapper[3979]: I0319 09:17:31.109481 3979 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:31.637854 master-0 
kubenswrapper[3979]: I0319 09:17:31.637802 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:31.895838 master-0 kubenswrapper[3979]: I0319 09:17:31.895637 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:17:31.896659 master-0 kubenswrapper[3979]: I0319 09:17:31.896357 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:31.897948 master-0 kubenswrapper[3979]: I0319 09:17:31.897873 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:31.897948 master-0 kubenswrapper[3979]: I0319 09:17:31.897944 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:31.898174 master-0 kubenswrapper[3979]: I0319 09:17:31.897968 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:31.902953 master-0 kubenswrapper[3979]: I0319 09:17:31.902890 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:32.254623 master-0 kubenswrapper[3979]: E0319 09:17:32.254390 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:17:32.478137 master-0 kubenswrapper[3979]: I0319 09:17:32.477974 3979 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 19 09:17:32.479910 master-0 kubenswrapper[3979]: I0319 09:17:32.479839 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:32.479910 master-0 kubenswrapper[3979]: I0319 09:17:32.479904 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:32.480153 master-0 kubenswrapper[3979]: I0319 09:17:32.479924 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:32.480153 master-0 kubenswrapper[3979]: I0319 09:17:32.480003 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:32.489335 master-0 kubenswrapper[3979]: E0319 09:17:32.489233 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 19 09:17:32.639594 master-0 kubenswrapper[3979]: I0319 09:17:32.639446 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:32.899818 master-0 kubenswrapper[3979]: I0319 09:17:32.899653 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:32.902298 master-0 kubenswrapper[3979]: I0319 09:17:32.902218 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:32.902449 master-0 kubenswrapper[3979]: I0319 09:17:32.902305 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:32.902449 master-0 kubenswrapper[3979]: I0319 09:17:32.902324 3979 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:33.638950 master-0 kubenswrapper[3979]: I0319 09:17:33.638864 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:34.639417 master-0 kubenswrapper[3979]: I0319 09:17:34.639320 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:35.637674 master-0 kubenswrapper[3979]: I0319 09:17:35.637591 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:35.750583 master-0 kubenswrapper[3979]: E0319 09:17:35.750355 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:17:36.638835 master-0 kubenswrapper[3979]: I0319 09:17:36.638737 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:37.635054 master-0 kubenswrapper[3979]: I0319 09:17:37.634978 3979 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:38.637522 master-0 kubenswrapper[3979]: I0319 09:17:38.637464 3979 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:39.140438 master-0 kubenswrapper[3979]: W0319 09:17:39.140348 3979 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 19 09:17:39.140688 master-0 kubenswrapper[3979]: E0319 09:17:39.140464 3979 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 09:17:39.260240 master-0 kubenswrapper[3979]: E0319 09:17:39.260158 3979 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:17:39.479683 master-0 kubenswrapper[3979]: I0319 09:17:39.479542 3979 csr.go:257] certificate signing request csr-zts2h is issued Mar 19 09:17:39.489677 master-0 kubenswrapper[3979]: I0319 09:17:39.489570 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:39.491342 master-0 kubenswrapper[3979]: I0319 09:17:39.491298 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:39.491433 master-0 kubenswrapper[3979]: I0319 09:17:39.491355 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:39.491433 master-0 kubenswrapper[3979]: I0319 
09:17:39.491369 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:39.491433 master-0 kubenswrapper[3979]: I0319 09:17:39.491431 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:39.496439 master-0 kubenswrapper[3979]: I0319 09:17:39.496395 3979 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 09:17:39.496610 master-0 kubenswrapper[3979]: E0319 09:17:39.496568 3979 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": read tcp 192.168.32.10:51936->192.168.32.10:6443: use of closed network connection" node="master-0" Mar 19 09:17:39.648803 master-0 kubenswrapper[3979]: I0319 09:17:39.648741 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:39.667701 master-0 kubenswrapper[3979]: I0319 09:17:39.667650 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:39.725584 master-0 kubenswrapper[3979]: I0319 09:17:39.725516 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:39.993872 master-0 kubenswrapper[3979]: I0319 09:17:39.993813 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:39.993872 master-0 kubenswrapper[3979]: E0319 09:17:39.993873 3979 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 19 09:17:40.017585 master-0 kubenswrapper[3979]: I0319 09:17:40.017489 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:40.035578 master-0 kubenswrapper[3979]: I0319 09:17:40.035487 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 
09:17:40.096477 master-0 kubenswrapper[3979]: I0319 09:17:40.096410 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:40.274594 master-0 kubenswrapper[3979]: I0319 09:17:40.274492 3979 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 09:17:40.355758 master-0 kubenswrapper[3979]: I0319 09:17:40.355714 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:40.355758 master-0 kubenswrapper[3979]: E0319 09:17:40.355752 3979 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 19 09:17:40.454696 master-0 kubenswrapper[3979]: I0319 09:17:40.454644 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:40.470560 master-0 kubenswrapper[3979]: I0319 09:17:40.470476 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:40.480845 master-0 kubenswrapper[3979]: I0319 09:17:40.480773 3979 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 09:08:48 +0000 UTC, rotation deadline is 2026-03-20 02:02:53.34479869 +0000 UTC Mar 19 09:17:40.480845 master-0 kubenswrapper[3979]: I0319 09:17:40.480838 3979 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 16h45m12.863965196s for next certificate rotation Mar 19 09:17:40.526671 master-0 kubenswrapper[3979]: I0319 09:17:40.526481 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:40.796908 master-0 kubenswrapper[3979]: I0319 09:17:40.795324 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:40.796908 master-0 kubenswrapper[3979]: E0319 09:17:40.795372 3979 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode 
annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 19 09:17:41.010589 master-0 kubenswrapper[3979]: I0319 09:17:41.010498 3979 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:17:41.363898 master-0 kubenswrapper[3979]: I0319 09:17:41.363814 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:41.379663 master-0 kubenswrapper[3979]: I0319 09:17:41.379547 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:41.477143 master-0 kubenswrapper[3979]: I0319 09:17:41.477068 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:41.608482 master-0 kubenswrapper[3979]: I0319 09:17:41.608394 3979 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:17:41.782895 master-0 kubenswrapper[3979]: I0319 09:17:41.782811 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:41.782895 master-0 kubenswrapper[3979]: E0319 09:17:41.782887 3979 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 19 09:17:42.782672 master-0 kubenswrapper[3979]: I0319 09:17:42.782491 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:42.784292 master-0 kubenswrapper[3979]: I0319 09:17:42.784220 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:42.784489 master-0 kubenswrapper[3979]: I0319 09:17:42.784399 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:42.784652 master-0 kubenswrapper[3979]: I0319 09:17:42.784580 3979 kubelet_node_status.go:724] "Recording 
event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:42.785257 master-0 kubenswrapper[3979]: I0319 09:17:42.785196 3979 scope.go:117] "RemoveContainer" containerID="ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b" Mar 19 09:17:42.785596 master-0 kubenswrapper[3979]: E0319 09:17:42.785515 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 09:17:45.751808 master-0 kubenswrapper[3979]: E0319 09:17:45.751735 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:17:45.961319 master-0 kubenswrapper[3979]: I0319 09:17:45.961224 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:46.191358 master-0 kubenswrapper[3979]: I0319 09:17:46.191256 3979 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:46.497856 master-0 kubenswrapper[3979]: I0319 09:17:46.497623 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:46.499093 master-0 kubenswrapper[3979]: I0319 09:17:46.499069 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:46.499166 master-0 kubenswrapper[3979]: I0319 09:17:46.499107 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:46.499166 master-0 kubenswrapper[3979]: I0319 09:17:46.499118 3979 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:46.499239 master-0 kubenswrapper[3979]: I0319 09:17:46.499178 3979 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:46.653823 master-0 kubenswrapper[3979]: E0319 09:17:46.653755 3979 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Mar 19 09:17:46.795333 master-0 kubenswrapper[3979]: I0319 09:17:46.795253 3979 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 19 09:17:46.795333 master-0 kubenswrapper[3979]: E0319 09:17:46.795316 3979 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 19 09:17:46.810979 master-0 kubenswrapper[3979]: E0319 09:17:46.810899 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:46.911615 master-0 kubenswrapper[3979]: E0319 09:17:46.911511 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.011979 master-0 kubenswrapper[3979]: E0319 09:17:47.011798 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.112789 master-0 kubenswrapper[3979]: E0319 09:17:47.112571 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.213198 master-0 kubenswrapper[3979]: E0319 09:17:47.213090 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.313978 master-0 kubenswrapper[3979]: E0319 09:17:47.313861 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.414936 master-0 kubenswrapper[3979]: E0319 09:17:47.414712 3979 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.515801 master-0 kubenswrapper[3979]: E0319 09:17:47.515651 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.616835 master-0 kubenswrapper[3979]: E0319 09:17:47.616714 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.667972 master-0 kubenswrapper[3979]: I0319 09:17:47.667722 3979 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 09:17:47.681092 master-0 kubenswrapper[3979]: I0319 09:17:47.681025 3979 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 09:17:47.716917 master-0 kubenswrapper[3979]: E0319 09:17:47.716837 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.817600 master-0 kubenswrapper[3979]: E0319 09:17:47.817474 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:47.918771 master-0 kubenswrapper[3979]: E0319 09:17:47.918566 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:48.019720 master-0 kubenswrapper[3979]: E0319 09:17:48.019598 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:48.120483 master-0 kubenswrapper[3979]: E0319 09:17:48.120388 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:48.221587 master-0 kubenswrapper[3979]: E0319 09:17:48.221402 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:17:48.322341 
master-0 kubenswrapper[3979]: E0319 09:17:48.322214 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
[... the entry above repeats verbatim at ~100 ms intervals, 09:17:48.422946 through 09:17:51.441202; duplicates omitted ...]
Mar 19 09:17:51.535112 master-0 kubenswrapper[3979]: I0319 09:17:51.535045 3979 csr.go:261] certificate signing request csr-z5pw9 is approved, waiting to be issued
Mar 19 09:17:51.542418 master-0 kubenswrapper[3979]: E0319 09:17:51.542366 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:17:51.542832 master-0 kubenswrapper[3979]: I0319 09:17:51.542779 3979 csr.go:257] certificate signing request csr-z5pw9 is issued
[... "Error getting the current node from lister" repeats, 09:17:51.642491 through 09:17:52.446702; duplicates omitted ...]
Mar 19 09:17:52.544771 master-0 kubenswrapper[3979]: I0319 09:17:52.544641 3979 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:08:48 +0000 UTC, rotation deadline is 2026-03-20 02:29:37.187845902 +0000 UTC
Mar 19 09:17:52.544771 master-0 kubenswrapper[3979]: I0319 09:17:52.544700 3979 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h11m44.643150844s for next certificate rotation
[... "Error getting the current node from lister" repeats, 09:17:52.547846 through 09:17:53.454457; duplicates omitted ...]
Mar 19 09:17:53.545696 master-0 kubenswrapper[3979]: I0319 09:17:53.545580 3979 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:08:48 +0000 UTC, rotation deadline is 2026-03-20 03:48:19.418536758 +0000 UTC
Mar 19 09:17:53.545696 master-0 kubenswrapper[3979]: I0319 09:17:53.545648 3979 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h30m25.872893877s for next certificate rotation
[... "Error getting the current node from lister" repeats, 09:17:53.554988 through 09:17:55.670199; duplicates omitted ...]
Mar 19 09:17:55.753058 master-0 kubenswrapper[3979]: E0319 09:17:55.752835 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
[... "Error getting the current node from lister" repeats, 09:17:55.771301 through 09:17:56.778810; duplicates omitted ...]
Mar 19 09:17:56.782457 master-0 kubenswrapper[3979]: I0319 09:17:56.782401 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:56.784415 master-0 kubenswrapper[3979]: I0319 09:17:56.784371 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:56.784510 master-0 kubenswrapper[3979]: I0319 09:17:56.784433 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:56.784510 master-0 kubenswrapper[3979]: I0319 09:17:56.784450 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:56.784903 master-0 kubenswrapper[3979]: I0319 09:17:56.784860 3979 scope.go:117] "RemoveContainer" containerID="ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b"
[... "Error getting the current node from lister" repeats, 09:17:56.879486 through 09:17:57.081651; duplicates omitted ...]
Mar 19 09:17:57.082851 master-0 kubenswrapper[3979]: E0319 09:17:57.082760 3979 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
[... "Error getting the current node from lister" repeats, 09:17:57.182201 through 09:17:57.888360; duplicates omitted ...]
Mar 19 09:17:57.968362 master-0 kubenswrapper[3979]: I0319 09:17:57.968224 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 09:17:57.969136 master-0 kubenswrapper[3979]: I0319 09:17:57.969072 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"792d2b5907d7be3b52add934725c063cf367a575639846dbd622e4989463bf6d"}
Mar 19 09:17:57.969343 master-0 kubenswrapper[3979]: I0319 09:17:57.969306 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:57.970489 master-0 kubenswrapper[3979]: I0319 09:17:57.970441 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:57.970612 master-0 kubenswrapper[3979]: I0319 09:17:57.970502 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:57.970612 master-0 kubenswrapper[3979]: I0319 09:17:57.970522 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
[... "Error getting the current node from lister" repeats, 09:17:57.989285 through 09:18:05.750781; duplicates omitted ...]
Mar 19 09:18:05.753209 master-0 kubenswrapper[3979]: E0319 09:18:05.753141 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
[... "Error getting the current node from lister" repeats, 09:18:05.851310 through 09:18:06.355906; duplicates omitted ...]
Mar 19 09:18:06.456222 master-0 kubenswrapper[3979]: E0319 09:18:06.456107 3979 kubelet_node_status.go:503] "Error getting the
current node from lister" err="node \"master-0\" not found" Mar 19 09:18:06.556723 master-0 kubenswrapper[3979]: E0319 09:18:06.556610 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:06.657157 master-0 kubenswrapper[3979]: E0319 09:18:06.657032 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:06.758143 master-0 kubenswrapper[3979]: E0319 09:18:06.758096 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:06.859188 master-0 kubenswrapper[3979]: E0319 09:18:06.859098 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:06.960428 master-0 kubenswrapper[3979]: E0319 09:18:06.960167 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.060573 master-0 kubenswrapper[3979]: E0319 09:18:07.060405 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.144663 master-0 kubenswrapper[3979]: E0319 09:18:07.144572 3979 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 19 09:18:07.161353 master-0 kubenswrapper[3979]: E0319 09:18:07.161232 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.262564 master-0 kubenswrapper[3979]: E0319 09:18:07.262363 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.363131 master-0 kubenswrapper[3979]: E0319 09:18:07.362998 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.463449 master-0 
kubenswrapper[3979]: E0319 09:18:07.463373 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.563558 master-0 kubenswrapper[3979]: E0319 09:18:07.563466 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.664275 master-0 kubenswrapper[3979]: E0319 09:18:07.664196 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.765350 master-0 kubenswrapper[3979]: E0319 09:18:07.765253 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.865953 master-0 kubenswrapper[3979]: E0319 09:18:07.865717 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:07.966655 master-0 kubenswrapper[3979]: E0319 09:18:07.966578 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.067800 master-0 kubenswrapper[3979]: E0319 09:18:08.067675 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.169158 master-0 kubenswrapper[3979]: E0319 09:18:08.168965 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.270305 master-0 kubenswrapper[3979]: E0319 09:18:08.270225 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.371123 master-0 kubenswrapper[3979]: E0319 09:18:08.370996 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.471959 master-0 kubenswrapper[3979]: E0319 09:18:08.471755 3979 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Mar 19 09:18:08.572097 master-0 kubenswrapper[3979]: E0319 09:18:08.571991 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.672567 master-0 kubenswrapper[3979]: E0319 09:18:08.672439 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.773680 master-0 kubenswrapper[3979]: E0319 09:18:08.773608 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.874870 master-0 kubenswrapper[3979]: E0319 09:18:08.874771 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:08.975861 master-0 kubenswrapper[3979]: E0319 09:18:08.975706 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.076433 master-0 kubenswrapper[3979]: E0319 09:18:09.076218 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.176886 master-0 kubenswrapper[3979]: E0319 09:18:09.176753 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.278084 master-0 kubenswrapper[3979]: E0319 09:18:09.277955 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.379204 master-0 kubenswrapper[3979]: E0319 09:18:09.378963 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.480014 master-0 kubenswrapper[3979]: E0319 09:18:09.479901 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.580804 master-0 kubenswrapper[3979]: E0319 09:18:09.580703 3979 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.681766 master-0 kubenswrapper[3979]: E0319 09:18:09.681566 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.782676 master-0 kubenswrapper[3979]: E0319 09:18:09.782598 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.883207 master-0 kubenswrapper[3979]: E0319 09:18:09.883133 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:09.984070 master-0 kubenswrapper[3979]: E0319 09:18:09.983863 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.085121 master-0 kubenswrapper[3979]: E0319 09:18:10.084943 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.185677 master-0 kubenswrapper[3979]: E0319 09:18:10.185580 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.286441 master-0 kubenswrapper[3979]: E0319 09:18:10.286334 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.387374 master-0 kubenswrapper[3979]: E0319 09:18:10.387274 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.488388 master-0 kubenswrapper[3979]: E0319 09:18:10.488299 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.589016 master-0 kubenswrapper[3979]: E0319 09:18:10.588837 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.689903 
master-0 kubenswrapper[3979]: E0319 09:18:10.689809 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.790348 master-0 kubenswrapper[3979]: E0319 09:18:10.790259 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.890979 master-0 kubenswrapper[3979]: E0319 09:18:10.890817 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:10.991761 master-0 kubenswrapper[3979]: E0319 09:18:10.991660 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.092824 master-0 kubenswrapper[3979]: E0319 09:18:11.092718 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.193087 master-0 kubenswrapper[3979]: E0319 09:18:11.192887 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.293781 master-0 kubenswrapper[3979]: E0319 09:18:11.293677 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.394477 master-0 kubenswrapper[3979]: E0319 09:18:11.394394 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.495170 master-0 kubenswrapper[3979]: E0319 09:18:11.494998 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.596134 master-0 kubenswrapper[3979]: E0319 09:18:11.596046 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.696445 master-0 kubenswrapper[3979]: E0319 09:18:11.696351 3979 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Mar 19 09:18:11.797567 master-0 kubenswrapper[3979]: E0319 09:18:11.797476 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.898410 master-0 kubenswrapper[3979]: E0319 09:18:11.898263 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:11.999584 master-0 kubenswrapper[3979]: E0319 09:18:11.999432 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.100786 master-0 kubenswrapper[3979]: E0319 09:18:12.100454 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.201725 master-0 kubenswrapper[3979]: E0319 09:18:12.201620 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.302490 master-0 kubenswrapper[3979]: E0319 09:18:12.302410 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.403663 master-0 kubenswrapper[3979]: E0319 09:18:12.403418 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.503975 master-0 kubenswrapper[3979]: E0319 09:18:12.503897 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.604228 master-0 kubenswrapper[3979]: E0319 09:18:12.604101 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.705039 master-0 kubenswrapper[3979]: E0319 09:18:12.704842 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.805106 master-0 kubenswrapper[3979]: E0319 09:18:12.805035 3979 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:12.906070 master-0 kubenswrapper[3979]: E0319 09:18:12.905946 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.007029 master-0 kubenswrapper[3979]: E0319 09:18:13.006782 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.107344 master-0 kubenswrapper[3979]: E0319 09:18:13.107252 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.208401 master-0 kubenswrapper[3979]: E0319 09:18:13.208274 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.309270 master-0 kubenswrapper[3979]: E0319 09:18:13.309143 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.410319 master-0 kubenswrapper[3979]: E0319 09:18:13.410235 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.511437 master-0 kubenswrapper[3979]: E0319 09:18:13.511313 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.612508 master-0 kubenswrapper[3979]: E0319 09:18:13.612338 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.712716 master-0 kubenswrapper[3979]: E0319 09:18:13.712615 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.813296 master-0 kubenswrapper[3979]: E0319 09:18:13.813142 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:13.914287 
master-0 kubenswrapper[3979]: E0319 09:18:13.913986 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.014408 master-0 kubenswrapper[3979]: E0319 09:18:14.014296 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.114845 master-0 kubenswrapper[3979]: E0319 09:18:14.114758 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.215873 master-0 kubenswrapper[3979]: E0319 09:18:14.215615 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.316871 master-0 kubenswrapper[3979]: E0319 09:18:14.316753 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.417747 master-0 kubenswrapper[3979]: E0319 09:18:14.417666 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.518643 master-0 kubenswrapper[3979]: E0319 09:18:14.518522 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.619074 master-0 kubenswrapper[3979]: E0319 09:18:14.619008 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.719370 master-0 kubenswrapper[3979]: E0319 09:18:14.719271 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.820086 master-0 kubenswrapper[3979]: E0319 09:18:14.819867 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:14.920874 master-0 kubenswrapper[3979]: E0319 09:18:14.920753 3979 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Mar 19 09:18:15.021927 master-0 kubenswrapper[3979]: E0319 09:18:15.021843 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.122483 master-0 kubenswrapper[3979]: E0319 09:18:15.122330 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.223408 master-0 kubenswrapper[3979]: E0319 09:18:15.223298 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.324490 master-0 kubenswrapper[3979]: E0319 09:18:15.324382 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.425746 master-0 kubenswrapper[3979]: E0319 09:18:15.425491 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.526053 master-0 kubenswrapper[3979]: E0319 09:18:15.525939 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.627267 master-0 kubenswrapper[3979]: E0319 09:18:15.627153 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.728376 master-0 kubenswrapper[3979]: E0319 09:18:15.728213 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.753728 master-0 kubenswrapper[3979]: E0319 09:18:15.753614 3979 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:18:15.828891 master-0 kubenswrapper[3979]: E0319 09:18:15.828789 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:15.929692 master-0 kubenswrapper[3979]: E0319 
09:18:15.929581 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.030107 master-0 kubenswrapper[3979]: E0319 09:18:16.029949 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.131115 master-0 kubenswrapper[3979]: E0319 09:18:16.131043 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.232042 master-0 kubenswrapper[3979]: E0319 09:18:16.231899 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.332658 master-0 kubenswrapper[3979]: E0319 09:18:16.332403 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.433369 master-0 kubenswrapper[3979]: E0319 09:18:16.433255 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.534512 master-0 kubenswrapper[3979]: E0319 09:18:16.534406 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.635726 master-0 kubenswrapper[3979]: E0319 09:18:16.635500 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.736235 master-0 kubenswrapper[3979]: E0319 09:18:16.736114 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.837022 master-0 kubenswrapper[3979]: E0319 09:18:16.836945 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:18:16.938114 master-0 kubenswrapper[3979]: E0319 09:18:16.937928 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" 
Mar 19 09:18:17.038265 master-0 kubenswrapper[3979]: E0319 09:18:17.038163 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.139096 master-0 kubenswrapper[3979]: E0319 09:18:17.139014 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.239885 master-0 kubenswrapper[3979]: E0319 09:18:17.239721 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.278170 master-0 kubenswrapper[3979]: E0319 09:18:17.278053 3979 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 19 09:18:17.340885 master-0 kubenswrapper[3979]: E0319 09:18:17.340823 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.441419 master-0 kubenswrapper[3979]: E0319 09:18:17.441333 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.542311 master-0 kubenswrapper[3979]: E0319 09:18:17.542244 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.643462 master-0 kubenswrapper[3979]: E0319 09:18:17.643387 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.744647 master-0 kubenswrapper[3979]: E0319 09:18:17.744484 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.845821 master-0 kubenswrapper[3979]: E0319 09:18:17.845620 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:17.946863 master-0 kubenswrapper[3979]: E0319 09:18:17.946757 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.047766 master-0 kubenswrapper[3979]: E0319 09:18:18.047649 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.148770 master-0 kubenswrapper[3979]: E0319 09:18:18.148584 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.249469 master-0 kubenswrapper[3979]: E0319 09:18:18.249353 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.350235 master-0 kubenswrapper[3979]: E0319 09:18:18.350150 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.451419 master-0 kubenswrapper[3979]: E0319 09:18:18.451265 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.552452 master-0 kubenswrapper[3979]: E0319 09:18:18.552351 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.653074 master-0 kubenswrapper[3979]: E0319 09:18:18.652955 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.753756 master-0 kubenswrapper[3979]: E0319 09:18:18.753466 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.854067 master-0 kubenswrapper[3979]: E0319 09:18:18.853917 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:18.954709 master-0 kubenswrapper[3979]: E0319 09:18:18.954607 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.055227 master-0 kubenswrapper[3979]: E0319 09:18:19.055135 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.155669 master-0 kubenswrapper[3979]: E0319 09:18:19.155593 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.256726 master-0 kubenswrapper[3979]: E0319 09:18:19.256602 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.357628 master-0 kubenswrapper[3979]: E0319 09:18:19.357383 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.459231 master-0 kubenswrapper[3979]: E0319 09:18:19.459119 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.559721 master-0 kubenswrapper[3979]: E0319 09:18:19.559640 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.659970 master-0 kubenswrapper[3979]: E0319 09:18:19.659786 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.760001 master-0 kubenswrapper[3979]: E0319 09:18:19.759916 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.861141 master-0 kubenswrapper[3979]: E0319 09:18:19.861065 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:19.962035 master-0 kubenswrapper[3979]: E0319 09:18:19.961724 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.062351 master-0 kubenswrapper[3979]: E0319 09:18:20.062289 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.163142 master-0 kubenswrapper[3979]: E0319 09:18:20.163078 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.264133 master-0 kubenswrapper[3979]: E0319 09:18:20.263926 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.364815 master-0 kubenswrapper[3979]: E0319 09:18:20.364757 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.465810 master-0 kubenswrapper[3979]: E0319 09:18:20.465716 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.566867 master-0 kubenswrapper[3979]: E0319 09:18:20.566818 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.667813 master-0 kubenswrapper[3979]: E0319 09:18:20.667735 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.768025 master-0 kubenswrapper[3979]: E0319 09:18:20.767980 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.782815 master-0 kubenswrapper[3979]: I0319 09:18:20.782751 3979 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:18:20.783891 master-0 kubenswrapper[3979]: I0319 09:18:20.783854 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:18:20.784026 master-0 kubenswrapper[3979]: I0319 09:18:20.783911 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:18:20.784026 master-0 kubenswrapper[3979]: I0319 09:18:20.783930 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:18:20.868457 master-0 kubenswrapper[3979]: E0319 09:18:20.868269 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:20.969137 master-0 kubenswrapper[3979]: E0319 09:18:20.969020 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.069446 master-0 kubenswrapper[3979]: E0319 09:18:21.069348 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.169618 master-0 kubenswrapper[3979]: E0319 09:18:21.169480 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.270417 master-0 kubenswrapper[3979]: E0319 09:18:21.270278 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.371602 master-0 kubenswrapper[3979]: E0319 09:18:21.371491 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.472509 master-0 kubenswrapper[3979]: E0319 09:18:21.472305 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.573716 master-0 kubenswrapper[3979]: E0319 09:18:21.573634 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.674890 master-0 kubenswrapper[3979]: E0319 09:18:21.674818 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.776015 master-0 kubenswrapper[3979]: E0319 09:18:21.775922 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.877252 master-0 kubenswrapper[3979]: E0319 09:18:21.877097 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:21.977619 master-0 kubenswrapper[3979]: E0319 09:18:21.977558 3979 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:18:22.011127 master-0 kubenswrapper[3979]: I0319 09:18:22.011067 3979 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 09:18:22.689223 master-0 kubenswrapper[3979]: I0319 09:18:22.689000 3979 apiserver.go:52] "Watching apiserver"
Mar 19 09:18:22.695654 master-0 kubenswrapper[3979]: I0319 09:18:22.695497 3979 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 09:18:22.695953 master-0 kubenswrapper[3979]: I0319 09:18:22.695888 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-kwrpk","openshift-cluster-version/cluster-version-operator-56d8475767-prd2q","openshift-network-operator/network-operator-7bd846bfc4-b4d28"]
Mar 19 09:18:22.696388 master-0 kubenswrapper[3979]: I0319 09:18:22.696296 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28"
Mar 19 09:18:22.696388 master-0 kubenswrapper[3979]: I0319 09:18:22.696362 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:18:22.696693 master-0 kubenswrapper[3979]: I0319 09:18:22.696361 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-kwrpk"
Mar 19 09:18:22.699472 master-0 kubenswrapper[3979]: I0319 09:18:22.699125 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 09:18:22.700003 master-0 kubenswrapper[3979]: I0319 09:18:22.699820 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 09:18:22.702682 master-0 kubenswrapper[3979]: I0319 09:18:22.702175 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 09:18:22.702682 master-0 kubenswrapper[3979]: I0319 09:18:22.702255 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:18:22.702682 master-0 kubenswrapper[3979]: I0319 09:18:22.702298 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 09:18:22.703598 master-0 kubenswrapper[3979]: I0319 09:18:22.703051 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 19 09:18:22.703598 master-0 kubenswrapper[3979]: I0319 09:18:22.703094 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Mar 19 09:18:22.703598 master-0 kubenswrapper[3979]: I0319 09:18:22.703369 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 09:18:22.703745 master-0 kubenswrapper[3979]: I0319 09:18:22.703611 3979 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 19 09:18:22.703905 master-0 kubenswrapper[3979]: I0319 09:18:22.703853 3979 reflector.go:368] Caches populated for *v1.ConfigMap
from object-"assisted-installer"/"kube-root-ca.crt" Mar 19 09:18:22.723899 master-0 kubenswrapper[3979]: I0319 09:18:22.723649 3979 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 19 09:18:22.774041 master-0 kubenswrapper[3979]: I0319 09:18:22.773988 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-sno-bootstrap-files\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.774041 master-0 kubenswrapper[3979]: I0319 09:18:22.774034 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774055 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774071 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hnvh\" (UniqueName: \"kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " 
pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774090 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-resolv-conf\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774110 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl5w2\" (UniqueName: \"kubernetes.io/projected/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-kube-api-access-gl5w2\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774132 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-ca-bundle\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774150 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774184 3979 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774201 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.774298 master-0 kubenswrapper[3979]: I0319 09:18:22.774269 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.774731 master-0 kubenswrapper[3979]: I0319 09:18:22.774305 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-var-run-resolv-conf\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.774731 master-0 kubenswrapper[3979]: I0319 09:18:22.774329 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.875192 master-0 kubenswrapper[3979]: I0319 09:18:22.875073 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875239 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-var-run-resolv-conf\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875283 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875312 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-resolv-conf\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.875583 master-0 
kubenswrapper[3979]: I0319 09:18:22.875355 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-sno-bootstrap-files\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875428 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-sno-bootstrap-files\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875432 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-resolv-conf\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875461 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-var-run-resolv-conf\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875431 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: 
\"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875487 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875553 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.875583 master-0 kubenswrapper[3979]: I0319 09:18:22.875576 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.876320 master-0 kubenswrapper[3979]: I0319 09:18:22.875614 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hnvh\" (UniqueName: \"kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.876320 master-0 kubenswrapper[3979]: I0319 09:18:22.875650 3979 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gl5w2\" (UniqueName: \"kubernetes.io/projected/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-kube-api-access-gl5w2\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.876320 master-0 kubenswrapper[3979]: I0319 09:18:22.876087 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-ca-bundle\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.876320 master-0 kubenswrapper[3979]: I0319 09:18:22.876130 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.876320 master-0 kubenswrapper[3979]: I0319 09:18:22.876167 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.876320 master-0 kubenswrapper[3979]: I0319 09:18:22.876200 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " 
pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.876320 master-0 kubenswrapper[3979]: I0319 09:18:22.876280 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-ca-bundle\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.876720 master-0 kubenswrapper[3979]: I0319 09:18:22.876347 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.877744 master-0 kubenswrapper[3979]: I0319 09:18:22.877693 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:22.878426 master-0 kubenswrapper[3979]: E0319 09:18:22.878320 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:22.878592 master-0 kubenswrapper[3979]: E0319 09:18:22.878561 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:18:23.37849275 +0000 UTC m=+78.421480528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:22.879806 master-0 kubenswrapper[3979]: I0319 09:18:22.879717 3979 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 09:18:22.885918 master-0 kubenswrapper[3979]: I0319 09:18:22.885865 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.894690 master-0 kubenswrapper[3979]: I0319 09:18:22.894600 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hnvh\" (UniqueName: \"kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:22.896012 master-0 kubenswrapper[3979]: I0319 09:18:22.895967 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl5w2\" (UniqueName: \"kubernetes.io/projected/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-kube-api-access-gl5w2\") pod \"assisted-installer-controller-kwrpk\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:22.896635 master-0 kubenswrapper[3979]: I0319 09:18:22.896617 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:23.014666 master-0 kubenswrapper[3979]: I0319 09:18:23.014561 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:18:23.037844 master-0 kubenswrapper[3979]: W0319 09:18:23.037503 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abcf2ea_50f5_4d62_8a23_583438e5b451.slice/crio-8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1 WatchSource:0}: Error finding container 8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1: Status 404 returned error can't find the container with id 8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1 Mar 19 09:18:23.040764 master-0 kubenswrapper[3979]: I0319 09:18:23.039982 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:23.048045 master-0 kubenswrapper[3979]: I0319 09:18:23.047457 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" event={"ID":"4abcf2ea-50f5-4d62-8a23-583438e5b451","Type":"ContainerStarted","Data":"8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1"} Mar 19 09:18:23.060109 master-0 kubenswrapper[3979]: W0319 09:18:23.059980 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e1a860_b3b0_4f3e_ac3d_9f4e40429ae9.slice/crio-3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba WatchSource:0}: Error finding container 3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba: Status 404 returned error can't find the container with id 3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba Mar 19 09:18:23.381143 master-0 kubenswrapper[3979]: I0319 09:18:23.380924 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:23.381395 master-0 kubenswrapper[3979]: E0319 09:18:23.381188 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:23.381395 master-0 kubenswrapper[3979]: E0319 09:18:23.381348 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. 
No retries permitted until 2026-03-19 09:18:24.381302664 +0000 UTC m=+79.424290282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:24.050306 master-0 kubenswrapper[3979]: I0319 09:18:24.050231 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-kwrpk" event={"ID":"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9","Type":"ContainerStarted","Data":"3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba"} Mar 19 09:18:24.388679 master-0 kubenswrapper[3979]: I0319 09:18:24.388428 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:24.388950 master-0 kubenswrapper[3979]: E0319 09:18:24.388685 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:24.388950 master-0 kubenswrapper[3979]: E0319 09:18:24.388791 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:18:26.388765792 +0000 UTC m=+81.431753360 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:26.405329 master-0 kubenswrapper[3979]: I0319 09:18:26.405247 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:26.405846 master-0 kubenswrapper[3979]: E0319 09:18:26.405381 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:26.405846 master-0 kubenswrapper[3979]: E0319 09:18:26.405444 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:18:30.405423989 +0000 UTC m=+85.448411567 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:26.827749 master-0 kubenswrapper[3979]: W0319 09:18:26.827679 3979 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 19 09:18:26.827976 master-0 kubenswrapper[3979]: I0319 09:18:26.827887 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 09:18:29.066315 master-0 kubenswrapper[3979]: I0319 09:18:29.066229 3979 generic.go:334] "Generic (PLEG): container finished" podID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerID="1e814e1f8603ada52f29b21f78df17d1b4dc0c1bc66fb422a5b77d8e27ae2d59" exitCode=0 Mar 19 09:18:29.067175 master-0 kubenswrapper[3979]: I0319 09:18:29.066336 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-kwrpk" event={"ID":"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9","Type":"ContainerDied","Data":"1e814e1f8603ada52f29b21f78df17d1b4dc0c1bc66fb422a5b77d8e27ae2d59"} Mar 19 09:18:29.068365 master-0 
kubenswrapper[3979]: I0319 09:18:29.068282 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" event={"ID":"4abcf2ea-50f5-4d62-8a23-583438e5b451","Type":"ContainerStarted","Data":"0ca36c4228886afdfb6b80a61e2423dd76188b00985839f0f0ad53c1f5d31db7"} Mar 19 09:18:29.149188 master-0 kubenswrapper[3979]: I0319 09:18:29.148971 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=3.1489422129999998 podStartE2EDuration="3.148942213s" podCreationTimestamp="2026-03-19 09:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:18:29.148505942 +0000 UTC m=+84.191493530" watchObservedRunningTime="2026-03-19 09:18:29.148942213 +0000 UTC m=+84.191929811" Mar 19 09:18:30.091220 master-0 kubenswrapper[3979]: I0319 09:18:30.091170 3979 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:30.236344 master-0 kubenswrapper[3979]: I0319 09:18:30.236256 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl5w2\" (UniqueName: \"kubernetes.io/projected/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-kube-api-access-gl5w2\") pod \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " Mar 19 09:18:30.236344 master-0 kubenswrapper[3979]: I0319 09:18:30.236301 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-resolv-conf\") pod \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " Mar 19 09:18:30.236344 master-0 kubenswrapper[3979]: I0319 09:18:30.236325 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-ca-bundle\") pod \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " Mar 19 09:18:30.236344 master-0 kubenswrapper[3979]: I0319 09:18:30.236350 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-sno-bootstrap-files\") pod \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " Mar 19 09:18:30.236731 master-0 kubenswrapper[3979]: I0319 09:18:30.236374 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-var-run-resolv-conf\") pod \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\" (UID: \"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9\") " Mar 19 09:18:30.236731 master-0 
kubenswrapper[3979]: I0319 09:18:30.236451 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" (UID: "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:18:30.236731 master-0 kubenswrapper[3979]: I0319 09:18:30.236488 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" (UID: "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:18:30.236731 master-0 kubenswrapper[3979]: I0319 09:18:30.236520 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" (UID: "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:18:30.236731 master-0 kubenswrapper[3979]: I0319 09:18:30.236497 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" (UID: "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9"). InnerVolumeSpecName "sno-bootstrap-files". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:18:30.240395 master-0 kubenswrapper[3979]: I0319 09:18:30.240305 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-kube-api-access-gl5w2" (OuterVolumeSpecName: "kube-api-access-gl5w2") pod "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" (UID: "84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9"). InnerVolumeSpecName "kube-api-access-gl5w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:18:30.337670 master-0 kubenswrapper[3979]: I0319 09:18:30.337482 3979 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl5w2\" (UniqueName: \"kubernetes.io/projected/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-kube-api-access-gl5w2\") on node \"master-0\" DevicePath \"\"" Mar 19 09:18:30.337670 master-0 kubenswrapper[3979]: I0319 09:18:30.337575 3979 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:18:30.337670 master-0 kubenswrapper[3979]: I0319 09:18:30.337597 3979 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:18:30.337670 master-0 kubenswrapper[3979]: I0319 09:18:30.337616 3979 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 19 09:18:30.337670 master-0 kubenswrapper[3979]: I0319 09:18:30.337635 3979 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9-host-var-run-resolv-conf\") on node \"master-0\" DevicePath 
\"\"" Mar 19 09:18:30.438261 master-0 kubenswrapper[3979]: I0319 09:18:30.438158 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:30.438603 master-0 kubenswrapper[3979]: E0319 09:18:30.438333 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:30.438603 master-0 kubenswrapper[3979]: E0319 09:18:30.438425 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:18:38.438397103 +0000 UTC m=+93.481384721 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:31.077761 master-0 kubenswrapper[3979]: I0319 09:18:31.077614 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-kwrpk" event={"ID":"84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9","Type":"ContainerDied","Data":"3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba"} Mar 19 09:18:31.077761 master-0 kubenswrapper[3979]: I0319 09:18:31.077734 3979 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba" Mar 19 09:18:31.078149 master-0 kubenswrapper[3979]: I0319 09:18:31.077833 3979 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:18:32.091276 master-0 kubenswrapper[3979]: I0319 09:18:32.091181 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" podStartSLOduration=39.043918666 podStartE2EDuration="44.091156906s" podCreationTimestamp="2026-03-19 09:17:48 +0000 UTC" firstStartedPulling="2026-03-19 09:18:23.040824392 +0000 UTC m=+78.083811980" lastFinishedPulling="2026-03-19 09:18:28.088062602 +0000 UTC m=+83.131050220" observedRunningTime="2026-03-19 09:18:29.21877896 +0000 UTC m=+84.261766578" watchObservedRunningTime="2026-03-19 09:18:32.091156906 +0000 UTC m=+87.134144484" Mar 19 09:18:33.724402 master-0 kubenswrapper[3979]: I0319 09:18:33.724314 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-mhc5l"] Mar 19 09:18:33.724402 master-0 kubenswrapper[3979]: E0319 09:18:33.724414 3979 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:18:33.725492 master-0 kubenswrapper[3979]: I0319 09:18:33.724427 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:18:33.725492 master-0 kubenswrapper[3979]: I0319 09:18:33.724453 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:18:33.725492 master-0 kubenswrapper[3979]: I0319 09:18:33.724665 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-mhc5l" Mar 19 09:18:33.866158 master-0 kubenswrapper[3979]: I0319 09:18:33.866067 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4s9m\" (UniqueName: \"kubernetes.io/projected/5ebf851a-172c-4f6d-9b72-9ae8afa5e950-kube-api-access-w4s9m\") pod \"mtu-prober-mhc5l\" (UID: \"5ebf851a-172c-4f6d-9b72-9ae8afa5e950\") " pod="openshift-network-operator/mtu-prober-mhc5l" Mar 19 09:18:33.967063 master-0 kubenswrapper[3979]: I0319 09:18:33.966959 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4s9m\" (UniqueName: \"kubernetes.io/projected/5ebf851a-172c-4f6d-9b72-9ae8afa5e950-kube-api-access-w4s9m\") pod \"mtu-prober-mhc5l\" (UID: \"5ebf851a-172c-4f6d-9b72-9ae8afa5e950\") " pod="openshift-network-operator/mtu-prober-mhc5l" Mar 19 09:18:33.988495 master-0 kubenswrapper[3979]: I0319 09:18:33.988237 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4s9m\" (UniqueName: \"kubernetes.io/projected/5ebf851a-172c-4f6d-9b72-9ae8afa5e950-kube-api-access-w4s9m\") pod \"mtu-prober-mhc5l\" (UID: \"5ebf851a-172c-4f6d-9b72-9ae8afa5e950\") " 
pod="openshift-network-operator/mtu-prober-mhc5l" Mar 19 09:18:34.036670 master-0 kubenswrapper[3979]: I0319 09:18:34.036582 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-mhc5l" Mar 19 09:18:34.095059 master-0 kubenswrapper[3979]: I0319 09:18:34.094819 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-mhc5l" event={"ID":"5ebf851a-172c-4f6d-9b72-9ae8afa5e950","Type":"ContainerStarted","Data":"ea06326f75dbe8dd7c60652c7838fe0eb8d997984652bd4f5b739f7370b57187"} Mar 19 09:18:35.100276 master-0 kubenswrapper[3979]: I0319 09:18:35.100153 3979 generic.go:334] "Generic (PLEG): container finished" podID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerID="d486a2c521f4c2c3eb232b1929f8a1ec255878f2382227f7f128e10063843ecc" exitCode=0 Mar 19 09:18:35.100276 master-0 kubenswrapper[3979]: I0319 09:18:35.100214 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-mhc5l" event={"ID":"5ebf851a-172c-4f6d-9b72-9ae8afa5e950","Type":"ContainerDied","Data":"d486a2c521f4c2c3eb232b1929f8a1ec255878f2382227f7f128e10063843ecc"} Mar 19 09:18:36.120384 master-0 kubenswrapper[3979]: I0319 09:18:36.120350 3979 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-mhc5l" Mar 19 09:18:36.285663 master-0 kubenswrapper[3979]: I0319 09:18:36.285584 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4s9m\" (UniqueName: \"kubernetes.io/projected/5ebf851a-172c-4f6d-9b72-9ae8afa5e950-kube-api-access-w4s9m\") pod \"5ebf851a-172c-4f6d-9b72-9ae8afa5e950\" (UID: \"5ebf851a-172c-4f6d-9b72-9ae8afa5e950\") " Mar 19 09:18:36.291021 master-0 kubenswrapper[3979]: I0319 09:18:36.290944 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebf851a-172c-4f6d-9b72-9ae8afa5e950-kube-api-access-w4s9m" (OuterVolumeSpecName: "kube-api-access-w4s9m") pod "5ebf851a-172c-4f6d-9b72-9ae8afa5e950" (UID: "5ebf851a-172c-4f6d-9b72-9ae8afa5e950"). InnerVolumeSpecName "kube-api-access-w4s9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:18:36.386152 master-0 kubenswrapper[3979]: I0319 09:18:36.385960 3979 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4s9m\" (UniqueName: \"kubernetes.io/projected/5ebf851a-172c-4f6d-9b72-9ae8afa5e950-kube-api-access-w4s9m\") on node \"master-0\" DevicePath \"\"" Mar 19 09:18:37.110621 master-0 kubenswrapper[3979]: I0319 09:18:37.110501 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-mhc5l" event={"ID":"5ebf851a-172c-4f6d-9b72-9ae8afa5e950","Type":"ContainerDied","Data":"ea06326f75dbe8dd7c60652c7838fe0eb8d997984652bd4f5b739f7370b57187"} Mar 19 09:18:37.110621 master-0 kubenswrapper[3979]: I0319 09:18:37.110592 3979 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea06326f75dbe8dd7c60652c7838fe0eb8d997984652bd4f5b739f7370b57187" Mar 19 09:18:37.110995 master-0 kubenswrapper[3979]: I0319 09:18:37.110661 3979 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-mhc5l" Mar 19 09:18:38.502336 master-0 kubenswrapper[3979]: I0319 09:18:38.502241 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:38.502966 master-0 kubenswrapper[3979]: E0319 09:18:38.502472 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:38.502966 master-0 kubenswrapper[3979]: E0319 09:18:38.502625 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:18:54.502596169 +0000 UTC m=+109.545583777 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:38.733185 master-0 kubenswrapper[3979]: I0319 09:18:38.732831 3979 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-mhc5l"] Mar 19 09:18:38.738886 master-0 kubenswrapper[3979]: I0319 09:18:38.738821 3979 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-mhc5l"] Mar 19 09:18:39.789972 master-0 kubenswrapper[3979]: I0319 09:18:39.789900 3979 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" path="/var/lib/kubelet/pods/5ebf851a-172c-4f6d-9b72-9ae8afa5e950/volumes" Mar 19 09:18:43.615317 master-0 kubenswrapper[3979]: I0319 09:18:43.615213 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bzdzd"] Mar 19 09:18:43.616096 master-0 kubenswrapper[3979]: E0319 09:18:43.615827 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:18:43.616096 master-0 kubenswrapper[3979]: I0319 09:18:43.615843 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:18:43.616096 master-0 kubenswrapper[3979]: I0319 09:18:43.615867 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:18:43.616096 master-0 kubenswrapper[3979]: I0319 09:18:43.616063 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.618183 master-0 kubenswrapper[3979]: I0319 09:18:43.618145 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:18:43.621780 master-0 kubenswrapper[3979]: I0319 09:18:43.621711 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 09:18:43.621885 master-0 kubenswrapper[3979]: I0319 09:18:43.621780 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:18:43.623784 master-0 kubenswrapper[3979]: I0319 09:18:43.622781 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:18:43.745030 master-0 kubenswrapper[3979]: I0319 09:18:43.744920 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745030 master-0 kubenswrapper[3979]: I0319 09:18:43.745001 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745030 master-0 kubenswrapper[3979]: I0319 09:18:43.745027 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " 
pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745428 master-0 kubenswrapper[3979]: I0319 09:18:43.745149 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745428 master-0 kubenswrapper[3979]: I0319 09:18:43.745204 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhzsr\" (UniqueName: \"kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745428 master-0 kubenswrapper[3979]: I0319 09:18:43.745230 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745428 master-0 kubenswrapper[3979]: I0319 09:18:43.745310 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745428 master-0 kubenswrapper[3979]: I0319 09:18:43.745366 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " 
pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745428 master-0 kubenswrapper[3979]: I0319 09:18:43.745391 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745445 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745516 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745575 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745602 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " 
pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745622 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745641 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745663 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.745798 master-0 kubenswrapper[3979]: I0319 09:18:43.745682 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.801563 master-0 kubenswrapper[3979]: I0319 09:18:43.801485 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-8kv6s"] Mar 19 09:18:43.802145 master-0 kubenswrapper[3979]: I0319 09:18:43.802113 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.803885 master-0 kubenswrapper[3979]: W0319 09:18:43.803784 3979 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'master-0' and this object Mar 19 09:18:43.804041 master-0 kubenswrapper[3979]: E0319 09:18:43.803900 3979 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 19 09:18:43.805747 master-0 kubenswrapper[3979]: I0319 09:18:43.805688 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 09:18:43.847130 master-0 kubenswrapper[3979]: I0319 09:18:43.847005 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847130 master-0 kubenswrapper[3979]: I0319 09:18:43.847070 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 
09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847298 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847329 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847354 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847381 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847406 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847411 3979 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847566 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847633 master-0 kubenswrapper[3979]: I0319 09:18:43.847629 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847666 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847705 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847738 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847786 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847836 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847868 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhzsr\" (UniqueName: \"kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847879 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847894 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" 
(UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.847915 master-0 kubenswrapper[3979]: I0319 09:18:43.847920 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848190 master-0 kubenswrapper[3979]: I0319 09:18:43.847931 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848190 master-0 kubenswrapper[3979]: I0319 09:18:43.847944 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848190 master-0 kubenswrapper[3979]: I0319 09:18:43.848088 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848190 master-0 kubenswrapper[3979]: I0319 09:18:43.848119 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 
09:18:43.848190 master-0 kubenswrapper[3979]: I0319 09:18:43.848138 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848190 master-0 kubenswrapper[3979]: I0319 09:18:43.848170 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848190 master-0 kubenswrapper[3979]: I0319 09:18:43.848191 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848410 master-0 kubenswrapper[3979]: I0319 09:18:43.848174 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848410 master-0 kubenswrapper[3979]: I0319 09:18:43.848213 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848410 master-0 kubenswrapper[3979]: I0319 09:18:43.848216 3979 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848410 master-0 kubenswrapper[3979]: I0319 09:18:43.848248 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.848410 master-0 kubenswrapper[3979]: I0319 09:18:43.848259 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.849154 master-0 kubenswrapper[3979]: I0319 09:18:43.849111 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.849295 master-0 kubenswrapper[3979]: I0319 09:18:43.849236 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.874593 master-0 kubenswrapper[3979]: I0319 09:18:43.874432 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhzsr\" (UniqueName: \"kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr\") pod 
\"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.935870 master-0 kubenswrapper[3979]: I0319 09:18:43.935792 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bzdzd" Mar 19 09:18:43.949322 master-0 kubenswrapper[3979]: I0319 09:18:43.949258 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.949472 master-0 kubenswrapper[3979]: I0319 09:18:43.949342 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.949472 master-0 kubenswrapper[3979]: I0319 09:18:43.949385 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxjqg\" (UniqueName: \"kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.949472 master-0 kubenswrapper[3979]: I0319 09:18:43.949420 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: 
\"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.949472 master-0 kubenswrapper[3979]: I0319 09:18:43.949443 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.949670 master-0 kubenswrapper[3979]: I0319 09:18:43.949477 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.949670 master-0 kubenswrapper[3979]: I0319 09:18:43.949582 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.949670 master-0 kubenswrapper[3979]: I0319 09:18:43.949624 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:43.950179 master-0 kubenswrapper[3979]: W0319 09:18:43.950129 3979 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157e3524_eb27_41ca_b49d_2697ee1245ca.slice/crio-9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174 WatchSource:0}: Error finding container 9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174: Status 404 returned error can't find the container with id 9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174 Mar 19 09:18:44.050486 master-0 kubenswrapper[3979]: I0319 09:18:44.050409 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.050486 master-0 kubenswrapper[3979]: I0319 09:18:44.050467 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.050486 master-0 kubenswrapper[3979]: I0319 09:18:44.050491 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.050768 master-0 kubenswrapper[3979]: I0319 09:18:44.050514 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.051118 master-0 kubenswrapper[3979]: I0319 09:18:44.051090 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxjqg\" (UniqueName: \"kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.051156 master-0 kubenswrapper[3979]: I0319 09:18:44.051125 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.051156 master-0 kubenswrapper[3979]: I0319 09:18:44.051145 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.051208 master-0 kubenswrapper[3979]: I0319 09:18:44.051164 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.051257 master-0 kubenswrapper[3979]: I0319 09:18:44.051236 
3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.051611 master-0 kubenswrapper[3979]: I0319 09:18:44.051570 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.051763 master-0 kubenswrapper[3979]: I0319 09:18:44.051701 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.052046 master-0 kubenswrapper[3979]: I0319 09:18:44.051985 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.052435 master-0 kubenswrapper[3979]: I0319 09:18:44.052386 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " 
pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.054590 master-0 kubenswrapper[3979]: I0319 09:18:44.054501 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.079963 master-0 kubenswrapper[3979]: I0319 09:18:44.079886 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxjqg\" (UniqueName: \"kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.128279 master-0 kubenswrapper[3979]: I0319 09:18:44.128075 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzdzd" event={"ID":"157e3524-eb27-41ca-b49d-2697ee1245ca","Type":"ContainerStarted","Data":"9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174"} Mar 19 09:18:44.889705 master-0 kubenswrapper[3979]: I0319 09:18:44.889618 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 09:18:44.962729 master-0 kubenswrapper[3979]: I0319 09:18:44.961223 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:18:44.962935 master-0 kubenswrapper[3979]: I0319 09:18:44.962832 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " 
pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:44.979091 master-0 kubenswrapper[3979]: I0319 09:18:44.979010 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nq9vs"] Mar 19 09:18:44.979570 master-0 kubenswrapper[3979]: I0319 09:18:44.979520 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:44.979660 master-0 kubenswrapper[3979]: E0319 09:18:44.979624 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:18:45.023229 master-0 kubenswrapper[3979]: I0319 09:18:45.023168 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:18:45.059405 master-0 kubenswrapper[3979]: I0319 09:18:45.059349 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:45.059570 master-0 kubenswrapper[3979]: I0319 09:18:45.059436 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clpb5\" (UniqueName: \"kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:45.131393 master-0 kubenswrapper[3979]: I0319 09:18:45.131294 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerStarted","Data":"44d9515c76d2b5369510d3737c2fe1814c5099a9199ebffb839eb4e657e0735e"} Mar 19 09:18:45.160877 master-0 kubenswrapper[3979]: I0319 09:18:45.160732 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:45.161084 master-0 kubenswrapper[3979]: I0319 09:18:45.160899 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clpb5\" (UniqueName: \"kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5\") pod 
\"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:45.161084 master-0 kubenswrapper[3979]: E0319 09:18:45.160973 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:45.161084 master-0 kubenswrapper[3979]: E0319 09:18:45.161087 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:18:45.661055901 +0000 UTC m=+100.704043689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:45.175863 master-0 kubenswrapper[3979]: I0319 09:18:45.175763 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=1.175742289 podStartE2EDuration="1.175742289s" podCreationTimestamp="2026-03-19 09:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:18:45.175228374 +0000 UTC m=+100.218215952" watchObservedRunningTime="2026-03-19 09:18:45.175742289 +0000 UTC m=+100.218729877" Mar 19 09:18:45.374299 master-0 kubenswrapper[3979]: I0319 09:18:45.374220 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clpb5\" (UniqueName: \"kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " 
pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:45.666712 master-0 kubenswrapper[3979]: I0319 09:18:45.666634 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:45.666938 master-0 kubenswrapper[3979]: E0319 09:18:45.666830 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:45.666938 master-0 kubenswrapper[3979]: E0319 09:18:45.666910 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:18:46.666887138 +0000 UTC m=+101.709874726 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:46.675306 master-0 kubenswrapper[3979]: I0319 09:18:46.675228 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:46.675964 master-0 kubenswrapper[3979]: E0319 09:18:46.675452 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:46.675964 master-0 kubenswrapper[3979]: E0319 09:18:46.675589 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:18:48.675556822 +0000 UTC m=+103.718544400 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:46.782799 master-0 kubenswrapper[3979]: I0319 09:18:46.782733 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:46.782998 master-0 kubenswrapper[3979]: E0319 09:18:46.782898 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:18:48.693359 master-0 kubenswrapper[3979]: I0319 09:18:48.693281 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:48.697390 master-0 kubenswrapper[3979]: E0319 09:18:48.694851 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:48.697390 master-0 kubenswrapper[3979]: E0319 09:18:48.695265 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:18:52.695132679 +0000 UTC m=+107.738120297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:48.795518 master-0 kubenswrapper[3979]: I0319 09:18:48.794896 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:48.795518 master-0 kubenswrapper[3979]: E0319 09:18:48.795090 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:18:49.143045 master-0 kubenswrapper[3979]: I0319 09:18:49.142982 3979 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="11e09cac68fe5f9a91247cf89d443e062789ce0301fe0e6f213f48df912e0870" exitCode=0 Mar 19 09:18:49.143045 master-0 kubenswrapper[3979]: I0319 09:18:49.143037 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerDied","Data":"11e09cac68fe5f9a91247cf89d443e062789ce0301fe0e6f213f48df912e0870"} Mar 19 09:18:50.782542 master-0 kubenswrapper[3979]: I0319 09:18:50.782488 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:50.783109 master-0 kubenswrapper[3979]: E0319 09:18:50.782668 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:18:51.795292 master-0 kubenswrapper[3979]: I0319 09:18:51.795232 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 09:18:52.725125 master-0 kubenswrapper[3979]: I0319 09:18:52.725049 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:52.725353 master-0 kubenswrapper[3979]: E0319 09:18:52.725311 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:52.725488 master-0 kubenswrapper[3979]: E0319 09:18:52.725455 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:00.725394458 +0000 UTC m=+115.768382236 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:52.782498 master-0 kubenswrapper[3979]: I0319 09:18:52.782439 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:18:52.782686 master-0 kubenswrapper[3979]: E0319 09:18:52.782656 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:18:54.539357 master-0 kubenswrapper[3979]: I0319 09:18:54.539269 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:18:54.540093 master-0 kubenswrapper[3979]: E0319 09:18:54.539516 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:54.540093 master-0 kubenswrapper[3979]: E0319 09:18:54.539659 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:19:26.53962746 +0000 UTC m=+141.582615038 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:18:54.782211 master-0 kubenswrapper[3979]: I0319 09:18:54.782148 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:18:54.782413 master-0 kubenswrapper[3979]: E0319 09:18:54.782315 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:18:55.797951 master-0 kubenswrapper[3979]: I0319 09:18:55.797856 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=4.797829834 podStartE2EDuration="4.797829834s" podCreationTimestamp="2026-03-19 09:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:18:55.797495695 +0000 UTC m=+110.840483283" watchObservedRunningTime="2026-03-19 09:18:55.797829834 +0000 UTC m=+110.840817422"
Mar 19 09:18:56.014442 master-0 kubenswrapper[3979]: I0319 09:18:56.014382 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"]
Mar 19 09:18:56.018361 master-0 kubenswrapper[3979]: I0319 09:18:56.018011 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.023282 master-0 kubenswrapper[3979]: I0319 09:18:56.022779 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:18:56.023282 master-0 kubenswrapper[3979]: I0319 09:18:56.022867 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:18:56.023282 master-0 kubenswrapper[3979]: I0319 09:18:56.022878 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 09:18:56.023282 master-0 kubenswrapper[3979]: I0319 09:18:56.023012 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 09:18:56.023282 master-0 kubenswrapper[3979]: I0319 09:18:56.023019 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 09:18:56.154424 master-0 kubenswrapper[3979]: I0319 09:18:56.154280 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.154424 master-0 kubenswrapper[3979]: I0319 09:18:56.154361 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.154424 master-0 kubenswrapper[3979]: I0319 09:18:56.154403 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.154913 master-0 kubenswrapper[3979]: I0319 09:18:56.154438 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtw68\" (UniqueName: \"kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.207050 master-0 kubenswrapper[3979]: I0319 09:18:56.206998 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nfj77"]
Mar 19 09:18:56.207677 master-0 kubenswrapper[3979]: I0319 09:18:56.207657 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.209646 master-0 kubenswrapper[3979]: I0319 09:18:56.209373 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 09:18:56.212973 master-0 kubenswrapper[3979]: I0319 09:18:56.212931 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:18:56.255606 master-0 kubenswrapper[3979]: I0319 09:18:56.254825 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.255606 master-0 kubenswrapper[3979]: I0319 09:18:56.254898 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtw68\" (UniqueName: \"kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.255606 master-0 kubenswrapper[3979]: I0319 09:18:56.254927 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.255606 master-0 kubenswrapper[3979]: I0319 09:18:56.254966 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.256710 master-0 kubenswrapper[3979]: I0319 09:18:56.256666 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.259421 master-0 kubenswrapper[3979]: I0319 09:18:56.259364 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.264249 master-0 kubenswrapper[3979]: I0319 09:18:56.264028 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.270404 master-0 kubenswrapper[3979]: I0319 09:18:56.270375 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtw68\" (UniqueName: \"kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.348624 master-0 kubenswrapper[3979]: I0319 09:18:56.348549 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:18:56.355774 master-0 kubenswrapper[3979]: I0319 09:18:56.355732 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-systemd\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.355774 master-0 kubenswrapper[3979]: I0319 09:18:56.355777 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-ovn\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.355930 master-0 kubenswrapper[3979]: I0319 09:18:56.355822 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-var-lib-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.355930 master-0 kubenswrapper[3979]: I0319 09:18:56.355849 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-log-socket\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356084 master-0 kubenswrapper[3979]: I0319 09:18:56.355955 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356084 master-0 kubenswrapper[3979]: I0319 09:18:56.356047 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-netns\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356288 master-0 kubenswrapper[3979]: I0319 09:18:56.356223 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356336 master-0 kubenswrapper[3979]: I0319 09:18:56.356301 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-systemd-units\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356383 master-0 kubenswrapper[3979]: I0319 09:18:56.356337 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-bin\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356424 master-0 kubenswrapper[3979]: I0319 09:18:56.356378 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-netd\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356424 master-0 kubenswrapper[3979]: I0319 09:18:56.356413 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-node-log\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356499 master-0 kubenswrapper[3979]: I0319 09:18:56.356447 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovn-node-metrics-cert\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356557 master-0 kubenswrapper[3979]: I0319 09:18:56.356516 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-config\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356608 master-0 kubenswrapper[3979]: I0319 09:18:56.356574 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-env-overrides\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356650 master-0 kubenswrapper[3979]: I0319 09:18:56.356603 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8szl\" (UniqueName: \"kubernetes.io/projected/e3ad145f-b791-4c0d-864a-8d7d6443f91a-kube-api-access-w8szl\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356690 master-0 kubenswrapper[3979]: I0319 09:18:56.356661 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-kubelet\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356732 master-0 kubenswrapper[3979]: I0319 09:18:56.356700 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-etc-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356774 master-0 kubenswrapper[3979]: I0319 09:18:56.356732 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356774 master-0 kubenswrapper[3979]: I0319 09:18:56.356757 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-script-lib\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.356852 master-0 kubenswrapper[3979]: I0319 09:18:56.356792 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-slash\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458025 master-0 kubenswrapper[3979]: I0319 09:18:56.457905 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-node-log\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458025 master-0 kubenswrapper[3979]: I0319 09:18:56.457961 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovn-node-metrics-cert\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458025 master-0 kubenswrapper[3979]: I0319 09:18:56.457989 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-config\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458242 master-0 kubenswrapper[3979]: I0319 09:18:56.458130 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-node-log\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458242 master-0 kubenswrapper[3979]: I0319 09:18:56.458165 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-env-overrides\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458699 master-0 kubenswrapper[3979]: I0319 09:18:56.458660 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8szl\" (UniqueName: \"kubernetes.io/projected/e3ad145f-b791-4c0d-864a-8d7d6443f91a-kube-api-access-w8szl\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458751 master-0 kubenswrapper[3979]: I0319 09:18:56.458713 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-kubelet\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458751 master-0 kubenswrapper[3979]: I0319 09:18:56.458739 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-etc-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458811 master-0 kubenswrapper[3979]: I0319 09:18:56.458761 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458811 master-0 kubenswrapper[3979]: I0319 09:18:56.458783 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-script-lib\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458869 master-0 kubenswrapper[3979]: I0319 09:18:56.458808 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-slash\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458869 master-0 kubenswrapper[3979]: I0319 09:18:56.458836 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-ovn\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458869 master-0 kubenswrapper[3979]: I0319 09:18:56.458858 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-systemd\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458949 master-0 kubenswrapper[3979]: I0319 09:18:56.458894 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-var-lib-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458949 master-0 kubenswrapper[3979]: I0319 09:18:56.458917 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-log-socket\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.458949 master-0 kubenswrapper[3979]: I0319 09:18:56.458942 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459027 master-0 kubenswrapper[3979]: I0319 09:18:56.458964 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-netns\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459027 master-0 kubenswrapper[3979]: I0319 09:18:56.459007 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459083 master-0 kubenswrapper[3979]: I0319 09:18:56.459035 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-systemd-units\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459083 master-0 kubenswrapper[3979]: I0319 09:18:56.459057 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-bin\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459162 master-0 kubenswrapper[3979]: I0319 09:18:56.459069 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-config\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459162 master-0 kubenswrapper[3979]: I0319 09:18:56.459117 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-env-overrides\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459162 master-0 kubenswrapper[3979]: I0319 09:18:56.459089 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-netd\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459286 master-0 kubenswrapper[3979]: I0319 09:18:56.459185 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459286 master-0 kubenswrapper[3979]: I0319 09:18:56.459203 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-kubelet\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459286 master-0 kubenswrapper[3979]: I0319 09:18:56.459231 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-etc-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459286 master-0 kubenswrapper[3979]: I0319 09:18:56.459240 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-bin\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459286 master-0 kubenswrapper[3979]: I0319 09:18:56.459206 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-netns\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459286 master-0 kubenswrapper[3979]: I0319 09:18:56.459284 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-systemd-units\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459309 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-log-socket\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459330 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-var-lib-openvswitch\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459334 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-slash\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459349 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-netd\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459372 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459311 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-ovn\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459390 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-systemd\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.459504 master-0 kubenswrapper[3979]: I0319 09:18:56.459436 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.460208 master-0 kubenswrapper[3979]: I0319 09:18:56.460164 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-script-lib\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.466421 master-0 kubenswrapper[3979]: I0319 09:18:56.465901 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovn-node-metrics-cert\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.479585 master-0 kubenswrapper[3979]: I0319 09:18:56.479519 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8szl\" (UniqueName: \"kubernetes.io/projected/e3ad145f-b791-4c0d-864a-8d7d6443f91a-kube-api-access-w8szl\") pod \"ovnkube-node-nfj77\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.524097 master-0 kubenswrapper[3979]: I0319 09:18:56.524034 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:18:56.782892 master-0 kubenswrapper[3979]: I0319 09:18:56.782801 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:18:56.783107 master-0 kubenswrapper[3979]: E0319 09:18:56.782974 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:18:57.926313 master-0 kubenswrapper[3979]: W0319 09:18:57.926243 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ad145f_b791_4c0d_864a_8d7d6443f91a.slice/crio-2724f078765cc41b21ea464b50fe169d860dc07093801eacc92a75b30e3593f5 WatchSource:0}: Error finding container 2724f078765cc41b21ea464b50fe169d860dc07093801eacc92a75b30e3593f5: Status 404 returned error can't find the container with id 2724f078765cc41b21ea464b50fe169d860dc07093801eacc92a75b30e3593f5
Mar 19 09:18:57.926963 master-0 kubenswrapper[3979]: W0319 09:18:57.926895 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41659a48_5eea_41cd_8b2a_b683dc15cc11.slice/crio-c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140 WatchSource:0}: Error finding container c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140: Status 404 returned error can't find the container with id c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140
Mar 19 09:18:58.165414 master-0 kubenswrapper[3979]: I0319 09:18:58.165315 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerStarted","Data":"e715d0ff200bfc6a3198a0daa26814bad61e6acd8631c88afff9d4a08fe673ba"}
Mar 19 09:18:58.168036 master-0 kubenswrapper[3979]: I0319 09:18:58.167898 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" event={"ID":"41659a48-5eea-41cd-8b2a-b683dc15cc11","Type":"ContainerStarted","Data":"4e81419e8a3661bab4b98b46c48f88f0c10dbce82380aeeca4ccc98648a34a84"}
Mar 19 09:18:58.168036 master-0 kubenswrapper[3979]: I0319 09:18:58.167946 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" event={"ID":"41659a48-5eea-41cd-8b2a-b683dc15cc11","Type":"ContainerStarted","Data":"c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140"}
Mar 19 09:18:58.169293 master-0 kubenswrapper[3979]: I0319 09:18:58.169227 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzdzd" event={"ID":"157e3524-eb27-41ca-b49d-2697ee1245ca","Type":"ContainerStarted","Data":"2d3477c3a9725b873c8e5413ca72191db0e07b17ecaa8a6d3f792473fd194137"}
Mar 19 09:18:58.170585 master-0 kubenswrapper[3979]: I0319 09:18:58.170501 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"2724f078765cc41b21ea464b50fe169d860dc07093801eacc92a75b30e3593f5"}
Mar 19 09:18:58.221085 master-0 kubenswrapper[3979]: I0319 09:18:58.220946 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bzdzd" podStartSLOduration=1.190764543 podStartE2EDuration="15.220906952s" podCreationTimestamp="2026-03-19 09:18:43 +0000 UTC" firstStartedPulling="2026-03-19 09:18:43.952938947 +0000 UTC m=+98.995926535" lastFinishedPulling="2026-03-19 09:18:57.983081366 +0000 UTC m=+113.026068944" observedRunningTime="2026-03-19 09:18:58.219804581 +0000 UTC m=+113.262792199" watchObservedRunningTime="2026-03-19 09:18:58.220906952 +0000 UTC m=+113.263894570"
Mar 19 09:18:58.782283 master-0 kubenswrapper[3979]: I0319 09:18:58.782235 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:18:58.782571 master-0 kubenswrapper[3979]: E0319 09:18:58.782393 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:18:59.179514 master-0 kubenswrapper[3979]: I0319 09:18:59.178441 3979 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="e715d0ff200bfc6a3198a0daa26814bad61e6acd8631c88afff9d4a08fe673ba" exitCode=0
Mar 19 09:18:59.179514 master-0 kubenswrapper[3979]: I0319 09:18:59.178575 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerDied","Data":"e715d0ff200bfc6a3198a0daa26814bad61e6acd8631c88afff9d4a08fe673ba"}
Mar 19 09:18:59.196747 master-0 kubenswrapper[3979]: I0319 09:18:59.196693 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-4s5vc"]
Mar 19 09:18:59.197121 master-0 kubenswrapper[3979]: I0319 09:18:59.197090 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:18:59.197188 master-0 kubenswrapper[3979]: E0319 09:18:59.197160 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:18:59.282820 master-0 kubenswrapper[3979]: I0319 09:18:59.282626 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:18:59.388558 master-0 kubenswrapper[3979]: I0319 09:18:59.387849 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:18:59.443908 master-0 kubenswrapper[3979]: E0319 09:18:59.443784 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:18:59.443908 master-0 kubenswrapper[3979]: E0319 09:18:59.443838 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:18:59.443908 master-0 kubenswrapper[3979]: E0319 09:18:59.443858 3979 projected.go:194] Error preparing data for projected volume kube-api-access-tpgbq for pod openshift-network-diagnostics/network-check-target-4s5vc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:18:59.444141 master-0 kubenswrapper[3979]: E0319 09:18:59.443950 3979 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq podName:10c609bb-136a-4ce2-b9e2-0a03e1a37a62 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:59.943922099 +0000 UTC m=+114.986909677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tpgbq" (UniqueName: "kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq") pod "network-check-target-4s5vc" (UID: "10c609bb-136a-4ce2-b9e2-0a03e1a37a62") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:18:59.992772 master-0 kubenswrapper[3979]: I0319 09:18:59.992707 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:18:59.992970 master-0 kubenswrapper[3979]: E0319 09:18:59.992930 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:18:59.992970 master-0 kubenswrapper[3979]: E0319 09:18:59.992952 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:18:59.992970 master-0 kubenswrapper[3979]: E0319 09:18:59.992965 3979 projected.go:194] Error preparing data for projected volume kube-api-access-tpgbq for pod openshift-network-diagnostics/network-check-target-4s5vc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] 
Mar 19 09:18:59.993101 master-0 kubenswrapper[3979]: E0319 09:18:59.993017 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq podName:10c609bb-136a-4ce2-b9e2-0a03e1a37a62 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:00.992999916 +0000 UTC m=+116.035987494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tpgbq" (UniqueName: "kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq") pod "network-check-target-4s5vc" (UID: "10c609bb-136a-4ce2-b9e2-0a03e1a37a62") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:19:00.782673 master-0 kubenswrapper[3979]: I0319 09:19:00.782627 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:00.783120 master-0 kubenswrapper[3979]: I0319 09:19:00.782693 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:00.783120 master-0 kubenswrapper[3979]: E0319 09:19:00.782759 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:00.783120 master-0 kubenswrapper[3979]: E0319 09:19:00.782839 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:00.800493 master-0 kubenswrapper[3979]: I0319 09:19:00.800438 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:00.800683 master-0 kubenswrapper[3979]: E0319 09:19:00.800642 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:19:00.800751 master-0 kubenswrapper[3979]: E0319 09:19:00.800732 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:16.800709166 +0000 UTC m=+131.843696744 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:19:01.002891 master-0 kubenswrapper[3979]: I0319 09:19:01.002841 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:01.003077 master-0 kubenswrapper[3979]: E0319 09:19:01.003049 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:19:01.003077 master-0 kubenswrapper[3979]: E0319 09:19:01.003072 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:19:01.003354 master-0 kubenswrapper[3979]: E0319 09:19:01.003083 3979 projected.go:194] Error preparing data for projected volume kube-api-access-tpgbq for pod openshift-network-diagnostics/network-check-target-4s5vc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:19:01.003354 master-0 kubenswrapper[3979]: E0319 09:19:01.003330 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq podName:10c609bb-136a-4ce2-b9e2-0a03e1a37a62 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:03.003309404 +0000 UTC m=+118.046296972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tpgbq" (UniqueName: "kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq") pod "network-check-target-4s5vc" (UID: "10c609bb-136a-4ce2-b9e2-0a03e1a37a62") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:19:01.187107 master-0 kubenswrapper[3979]: I0319 09:19:01.187041 3979 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="886f43f428dc7d770e78699ea2b9793dc0fcaa7dc9eeaeafd637bd2727c22201" exitCode=0 Mar 19 09:19:01.187107 master-0 kubenswrapper[3979]: I0319 09:19:01.187093 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerDied","Data":"886f43f428dc7d770e78699ea2b9793dc0fcaa7dc9eeaeafd637bd2727c22201"} Mar 19 09:19:01.798504 master-0 kubenswrapper[3979]: I0319 09:19:01.798450 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-slmgx"] Mar 19 09:19:01.799276 master-0 kubenswrapper[3979]: I0319 09:19:01.798940 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:01.801271 master-0 kubenswrapper[3979]: I0319 09:19:01.801225 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:19:01.801484 master-0 kubenswrapper[3979]: I0319 09:19:01.801457 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:19:01.801731 master-0 kubenswrapper[3979]: I0319 09:19:01.801705 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 09:19:01.802363 master-0 kubenswrapper[3979]: I0319 09:19:01.801854 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 09:19:01.802363 master-0 kubenswrapper[3979]: I0319 09:19:01.802014 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:19:01.916707 master-0 kubenswrapper[3979]: I0319 09:19:01.916660 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfq74\" (UniqueName: \"kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:01.916890 master-0 kubenswrapper[3979]: I0319 09:19:01.916751 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 
09:19:01.916890 master-0 kubenswrapper[3979]: I0319 09:19:01.916774 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:01.916890 master-0 kubenswrapper[3979]: I0319 09:19:01.916789 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.018202 master-0 kubenswrapper[3979]: I0319 09:19:02.018102 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.018202 master-0 kubenswrapper[3979]: I0319 09:19:02.018163 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.018451 master-0 kubenswrapper[3979]: E0319 09:19:02.018302 3979 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found Mar 19 09:19:02.018451 master-0 kubenswrapper[3979]: I0319 
09:19:02.018391 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.018451 master-0 kubenswrapper[3979]: E0319 09:19:02.018419 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert podName:58ea8fcc-29b2-48ef-8629-2ba217c9d70c nodeName:}" failed. No retries permitted until 2026-03-19 09:19:02.518391285 +0000 UTC m=+117.561378863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert") pod "network-node-identity-slmgx" (UID: "58ea8fcc-29b2-48ef-8629-2ba217c9d70c") : secret "network-node-identity-cert" not found Mar 19 09:19:02.018621 master-0 kubenswrapper[3979]: I0319 09:19:02.018469 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfq74\" (UniqueName: \"kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.019509 master-0 kubenswrapper[3979]: I0319 09:19:02.019438 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.019981 master-0 kubenswrapper[3979]: I0319 09:19:02.019926 3979 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.035031 master-0 kubenswrapper[3979]: I0319 09:19:02.034967 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfq74\" (UniqueName: \"kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.522394 master-0 kubenswrapper[3979]: I0319 09:19:02.522342 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.525445 master-0 kubenswrapper[3979]: I0319 09:19:02.525419 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.714663 master-0 kubenswrapper[3979]: I0319 09:19:02.714595 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:19:02.727823 master-0 kubenswrapper[3979]: W0319 09:19:02.727754 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ea8fcc_29b2_48ef_8629_2ba217c9d70c.slice/crio-ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942 WatchSource:0}: Error finding container ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942: Status 404 returned error can't find the container with id ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942 Mar 19 09:19:02.782928 master-0 kubenswrapper[3979]: I0319 09:19:02.782892 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:02.783058 master-0 kubenswrapper[3979]: I0319 09:19:02.782892 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:02.783058 master-0 kubenswrapper[3979]: E0319 09:19:02.783034 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:02.783122 master-0 kubenswrapper[3979]: E0319 09:19:02.783072 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:03.027968 master-0 kubenswrapper[3979]: I0319 09:19:03.027849 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:03.028393 master-0 kubenswrapper[3979]: E0319 09:19:03.028110 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:19:03.028393 master-0 kubenswrapper[3979]: E0319 09:19:03.028177 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:19:03.028393 master-0 kubenswrapper[3979]: E0319 09:19:03.028194 3979 projected.go:194] Error preparing data for projected volume kube-api-access-tpgbq for pod openshift-network-diagnostics/network-check-target-4s5vc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:19:03.028393 master-0 kubenswrapper[3979]: E0319 09:19:03.028285 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq podName:10c609bb-136a-4ce2-b9e2-0a03e1a37a62 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:07.028256631 +0000 UTC m=+122.071244289 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tpgbq" (UniqueName: "kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq") pod "network-check-target-4s5vc" (UID: "10c609bb-136a-4ce2-b9e2-0a03e1a37a62") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:19:03.204426 master-0 kubenswrapper[3979]: I0319 09:19:03.204384 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerStarted","Data":"2dce09604f673a98b5b76aa5ab393a537cdfc70dd6be1c99472f960c60ad55b9"} Mar 19 09:19:03.205866 master-0 kubenswrapper[3979]: I0319 09:19:03.205845 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-slmgx" event={"ID":"58ea8fcc-29b2-48ef-8629-2ba217c9d70c","Type":"ContainerStarted","Data":"ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942"} Mar 19 09:19:04.212342 master-0 kubenswrapper[3979]: I0319 09:19:04.212232 3979 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="2dce09604f673a98b5b76aa5ab393a537cdfc70dd6be1c99472f960c60ad55b9" exitCode=0 Mar 19 09:19:04.212342 master-0 kubenswrapper[3979]: I0319 09:19:04.212297 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerDied","Data":"2dce09604f673a98b5b76aa5ab393a537cdfc70dd6be1c99472f960c60ad55b9"} Mar 19 09:19:04.782890 master-0 kubenswrapper[3979]: I0319 09:19:04.782385 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:04.782890 master-0 kubenswrapper[3979]: I0319 09:19:04.782459 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:04.782890 master-0 kubenswrapper[3979]: E0319 09:19:04.782550 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:04.782890 master-0 kubenswrapper[3979]: E0319 09:19:04.782644 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:05.674865 master-0 kubenswrapper[3979]: E0319 09:19:05.674797 3979 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 19 09:19:05.778687 master-0 kubenswrapper[3979]: E0319 09:19:05.778539 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:19:06.783033 master-0 kubenswrapper[3979]: I0319 09:19:06.782987 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:06.784285 master-0 kubenswrapper[3979]: I0319 09:19:06.782999 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:06.784285 master-0 kubenswrapper[3979]: E0319 09:19:06.783139 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:06.784285 master-0 kubenswrapper[3979]: E0319 09:19:06.783205 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:07.067115 master-0 kubenswrapper[3979]: I0319 09:19:07.066989 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:07.067328 master-0 kubenswrapper[3979]: E0319 09:19:07.067276 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:19:07.067328 master-0 kubenswrapper[3979]: E0319 09:19:07.067317 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:19:07.067328 master-0 kubenswrapper[3979]: E0319 09:19:07.067330 3979 projected.go:194] Error preparing data for projected volume kube-api-access-tpgbq for pod openshift-network-diagnostics/network-check-target-4s5vc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:19:07.067464 master-0 kubenswrapper[3979]: E0319 09:19:07.067404 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq podName:10c609bb-136a-4ce2-b9e2-0a03e1a37a62 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:15.067384854 +0000 UTC m=+130.110372622 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tpgbq" (UniqueName: "kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq") pod "network-check-target-4s5vc" (UID: "10c609bb-136a-4ce2-b9e2-0a03e1a37a62") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:19:07.797575 master-0 kubenswrapper[3979]: I0319 09:19:07.797499 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 19 09:19:08.782185 master-0 kubenswrapper[3979]: I0319 09:19:08.782104 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:08.782185 master-0 kubenswrapper[3979]: I0319 09:19:08.782158 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:08.782483 master-0 kubenswrapper[3979]: E0319 09:19:08.782315 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:08.782636 master-0 kubenswrapper[3979]: E0319 09:19:08.782549 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:10.779913 master-0 kubenswrapper[3979]: E0319 09:19:10.779855 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 09:19:10.782165 master-0 kubenswrapper[3979]: I0319 09:19:10.782126 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:10.782226 master-0 kubenswrapper[3979]: I0319 09:19:10.782168 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:10.782266 master-0 kubenswrapper[3979]: E0319 09:19:10.782246 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:10.784641 master-0 kubenswrapper[3979]: E0319 09:19:10.782757 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:12.782316 master-0 kubenswrapper[3979]: I0319 09:19:12.782267 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:12.782316 master-0 kubenswrapper[3979]: I0319 09:19:12.782289 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:12.782852 master-0 kubenswrapper[3979]: E0319 09:19:12.782391 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:12.782852 master-0 kubenswrapper[3979]: E0319 09:19:12.782463 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:14.782706 master-0 kubenswrapper[3979]: I0319 09:19:14.782613 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:14.783332 master-0 kubenswrapper[3979]: I0319 09:19:14.782613 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:14.783332 master-0 kubenswrapper[3979]: E0319 09:19:14.782810 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:14.783332 master-0 kubenswrapper[3979]: E0319 09:19:14.782931 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:15.141207 master-0 kubenswrapper[3979]: I0319 09:19:15.141096 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:15.141593 master-0 kubenswrapper[3979]: E0319 09:19:15.141280 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:19:15.141593 master-0 kubenswrapper[3979]: E0319 09:19:15.141301 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:19:15.141593 master-0 kubenswrapper[3979]: E0319 09:19:15.141313 3979 projected.go:194] Error preparing data for projected volume kube-api-access-tpgbq for pod openshift-network-diagnostics/network-check-target-4s5vc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:19:15.141593 master-0 kubenswrapper[3979]: E0319 09:19:15.141364 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq podName:10c609bb-136a-4ce2-b9e2-0a03e1a37a62 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:31.141350162 +0000 UTC m=+146.184337740 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tpgbq" (UniqueName: "kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq") pod "network-check-target-4s5vc" (UID: "10c609bb-136a-4ce2-b9e2-0a03e1a37a62") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:19:15.781311 master-0 kubenswrapper[3979]: E0319 09:19:15.781169 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 09:19:16.782789 master-0 kubenswrapper[3979]: I0319 09:19:16.782649 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:16.783406 master-0 kubenswrapper[3979]: E0319 09:19:16.782853 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:16.783406 master-0 kubenswrapper[3979]: I0319 09:19:16.782885 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:16.783406 master-0 kubenswrapper[3979]: E0319 09:19:16.783068 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:16.859671 master-0 kubenswrapper[3979]: I0319 09:19:16.859518 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:16.859943 master-0 kubenswrapper[3979]: E0319 09:19:16.859774 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:19:16.859943 master-0 kubenswrapper[3979]: E0319 09:19:16.859865 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:48.85984722 +0000 UTC m=+163.902834798 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:19:17.358600 master-0 kubenswrapper[3979]: I0319 09:19:17.358519 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=10.358499598 podStartE2EDuration="10.358499598s" podCreationTimestamp="2026-03-19 09:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:19:17.358339814 +0000 UTC m=+132.401327412" watchObservedRunningTime="2026-03-19 09:19:17.358499598 +0000 UTC m=+132.401487196"
Mar 19 09:19:18.782319 master-0 kubenswrapper[3979]: I0319 09:19:18.782265 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:18.782939 master-0 kubenswrapper[3979]: E0319 09:19:18.782392 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:18.782939 master-0 kubenswrapper[3979]: I0319 09:19:18.782726 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:18.783827 master-0 kubenswrapper[3979]: E0319 09:19:18.783798 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:20.782482 master-0 kubenswrapper[3979]: I0319 09:19:20.782063 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:20.782482 master-0 kubenswrapper[3979]: I0319 09:19:20.782074 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:20.782482 master-0 kubenswrapper[3979]: E0319 09:19:20.782270 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:20.782482 master-0 kubenswrapper[3979]: E0319 09:19:20.782335 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:20.783279 master-0 kubenswrapper[3979]: E0319 09:19:20.782564 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 09:19:22.782728 master-0 kubenswrapper[3979]: I0319 09:19:22.782625 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:22.783214 master-0 kubenswrapper[3979]: E0319 09:19:22.782796 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:22.784232 master-0 kubenswrapper[3979]: I0319 09:19:22.783663 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:22.784232 master-0 kubenswrapper[3979]: E0319 09:19:22.783795 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:24.782361 master-0 kubenswrapper[3979]: I0319 09:19:24.782278 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:24.783191 master-0 kubenswrapper[3979]: E0319 09:19:24.782414 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:24.783191 master-0 kubenswrapper[3979]: I0319 09:19:24.782278 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:24.783191 master-0 kubenswrapper[3979]: E0319 09:19:24.782589 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:25.783785 master-0 kubenswrapper[3979]: E0319 09:19:25.783697 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 09:19:26.581120 master-0 kubenswrapper[3979]: I0319 09:19:26.581036 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:19:26.581456 master-0 kubenswrapper[3979]: E0319 09:19:26.581318 3979 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:19:26.581456 master-0 kubenswrapper[3979]: E0319 09:19:26.581405 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:30.581387758 +0000 UTC m=+205.624375336 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:19:26.782804 master-0 kubenswrapper[3979]: I0319 09:19:26.782750 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:26.782988 master-0 kubenswrapper[3979]: I0319 09:19:26.782773 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:26.783030 master-0 kubenswrapper[3979]: E0319 09:19:26.782879 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:26.783076 master-0 kubenswrapper[3979]: E0319 09:19:26.783004 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:28.783075 master-0 kubenswrapper[3979]: I0319 09:19:28.782981 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:28.783075 master-0 kubenswrapper[3979]: I0319 09:19:28.783067 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:28.783768 master-0 kubenswrapper[3979]: E0319 09:19:28.783186 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:28.783768 master-0 kubenswrapper[3979]: E0319 09:19:28.783354 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:29.269111 master-0 kubenswrapper[3979]: I0319 09:19:29.267575 3979 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nfj77"]
Mar 19 09:19:29.291349 master-0 kubenswrapper[3979]: I0319 09:19:29.291284 3979 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="3d15d0fa4a3f8c9035c8ce9b72d3cf571d79c5e3676c413632c3d1ba3c37a426" exitCode=0
Mar 19 09:19:29.291605 master-0 kubenswrapper[3979]: I0319 09:19:29.291361 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerDied","Data":"3d15d0fa4a3f8c9035c8ce9b72d3cf571d79c5e3676c413632c3d1ba3c37a426"}
Mar 19 09:19:29.295849 master-0 kubenswrapper[3979]: I0319 09:19:29.295801 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" event={"ID":"41659a48-5eea-41cd-8b2a-b683dc15cc11","Type":"ContainerStarted","Data":"1e26847fb86eb6f61757cb6db0dd5524be844b8153caf7a191fdb5d34b73f968"}
Mar 19 09:19:29.297204 master-0 kubenswrapper[3979]: I0319 09:19:29.297161 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6" exitCode=0
Mar 19 09:19:29.297241 master-0 kubenswrapper[3979]: I0319 09:19:29.297224 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"}
Mar 19 09:19:29.302049 master-0 kubenswrapper[3979]: I0319 09:19:29.302011 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-slmgx" event={"ID":"58ea8fcc-29b2-48ef-8629-2ba217c9d70c","Type":"ContainerStarted","Data":"f3f8d46c311f6229c7f187208601460b92a12322f95de2e7929223853ba347eb"}
Mar 19 09:19:29.551765 master-0 kubenswrapper[3979]: I0319 09:19:29.549842 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" podStartSLOduration=4.642166915 podStartE2EDuration="34.549817757s" podCreationTimestamp="2026-03-19 09:18:55 +0000 UTC" firstStartedPulling="2026-03-19 09:18:58.106941521 +0000 UTC m=+113.149929099" lastFinishedPulling="2026-03-19 09:19:28.014592353 +0000 UTC m=+143.057579941" observedRunningTime="2026-03-19 09:19:29.549240263 +0000 UTC m=+144.592227861" watchObservedRunningTime="2026-03-19 09:19:29.549817757 +0000 UTC m=+144.592805335"
Mar 19 09:19:30.310132 master-0 kubenswrapper[3979]: I0319 09:19:30.310075 3979 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="1de935d5d79686ee37ae77f43c7f709d103c6ab561712f1da495ac19ccceba4b" exitCode=0
Mar 19 09:19:30.311282 master-0 kubenswrapper[3979]: I0319 09:19:30.310163 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerDied","Data":"1de935d5d79686ee37ae77f43c7f709d103c6ab561712f1da495ac19ccceba4b"}
Mar 19 09:19:30.317081 master-0 kubenswrapper[3979]: I0319 09:19:30.317016 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"}
Mar 19 09:19:30.317081 master-0 kubenswrapper[3979]: I0319 09:19:30.317059 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"}
Mar 19 09:19:30.317081 master-0 kubenswrapper[3979]: I0319 09:19:30.317070 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"}
Mar 19 09:19:30.317081 master-0 kubenswrapper[3979]: I0319 09:19:30.317079 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"}
Mar 19 09:19:30.317081 master-0 kubenswrapper[3979]: I0319 09:19:30.317087 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"}
Mar 19 09:19:30.317081 master-0 kubenswrapper[3979]: I0319 09:19:30.317096 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"}
Mar 19 09:19:30.319666 master-0 kubenswrapper[3979]: I0319 09:19:30.319609 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-slmgx" event={"ID":"58ea8fcc-29b2-48ef-8629-2ba217c9d70c","Type":"ContainerStarted","Data":"16dabbfac23a88b18e7a1e5f639f318226358e768cd4e0f4bf6b8327e7b845c9"}
Mar 19 09:19:30.782908 master-0 kubenswrapper[3979]: I0319 09:19:30.782829 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:30.783093 master-0 kubenswrapper[3979]: E0319 09:19:30.782989 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:30.783093 master-0 kubenswrapper[3979]: I0319 09:19:30.782830 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:30.783209 master-0 kubenswrapper[3979]: E0319 09:19:30.783092 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:30.784687 master-0 kubenswrapper[3979]: E0319 09:19:30.784648 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 09:19:31.231500 master-0 kubenswrapper[3979]: I0319 09:19:31.231368 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:31.231825 master-0 kubenswrapper[3979]: E0319 09:19:31.231666 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:19:31.231825 master-0 kubenswrapper[3979]: E0319 09:19:31.231707 3979 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:19:31.231825 master-0 kubenswrapper[3979]: E0319 09:19:31.231732 3979 projected.go:194] Error preparing data for projected volume kube-api-access-tpgbq for pod openshift-network-diagnostics/network-check-target-4s5vc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:19:31.231825 master-0 kubenswrapper[3979]: E0319 09:19:31.231826 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq podName:10c609bb-136a-4ce2-b9e2-0a03e1a37a62 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:03.231794736 +0000 UTC m=+178.274782354 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-tpgbq" (UniqueName: "kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq") pod "network-check-target-4s5vc" (UID: "10c609bb-136a-4ce2-b9e2-0a03e1a37a62") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:19:31.326775 master-0 kubenswrapper[3979]: I0319 09:19:31.326658 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" event={"ID":"979d4d12-a560-4309-a1d3-cbebe853e8ea","Type":"ContainerStarted","Data":"5927b26a07b31048c0de1055a0753bec45a8b78ac90b0e1ebe3ecd80872c8a68"}
Mar 19 09:19:31.486352 master-0 kubenswrapper[3979]: I0319 09:19:31.486150 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-slmgx" podStartSLOduration=5.200945215 podStartE2EDuration="30.486130063s" podCreationTimestamp="2026-03-19 09:19:01 +0000 UTC" firstStartedPulling="2026-03-19 09:19:02.730252756 +0000 UTC m=+117.773240334" lastFinishedPulling="2026-03-19 09:19:28.015437564 +0000 UTC m=+143.058425182" observedRunningTime="2026-03-19 09:19:31.485650902 +0000 UTC m=+146.528638500" watchObservedRunningTime="2026-03-19 09:19:31.486130063 +0000 UTC m=+146.529117641"
Mar 19 09:19:32.339330 master-0 kubenswrapper[3979]: I0319 09:19:32.339135 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"}
Mar 19 09:19:32.782700 master-0 kubenswrapper[3979]: I0319 09:19:32.782596 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:32.782905 master-0 kubenswrapper[3979]: I0319 09:19:32.782596 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:32.782905 master-0 kubenswrapper[3979]: E0319 09:19:32.782812 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:32.782905 master-0 kubenswrapper[3979]: E0319 09:19:32.782868 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:34.782967 master-0 kubenswrapper[3979]: I0319 09:19:34.782478 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:19:34.784249 master-0 kubenswrapper[3979]: E0319 09:19:34.783042 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b"
Mar 19 09:19:34.784249 master-0 kubenswrapper[3979]: I0319 09:19:34.782516 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:19:34.784249 master-0 kubenswrapper[3979]: E0319 09:19:34.783285 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62"
Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352107 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerStarted","Data":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"}
Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352339 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-controller" containerID="cri-o://6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17" gracePeriod=30
Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352559 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352587 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352641 3979 kubelet.go:2542]
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352890 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="sbdb" containerID="cri-o://20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" gracePeriod=30 Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352945 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="nbdb" containerID="cri-o://0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" gracePeriod=30 Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.352987 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="northd" containerID="cri-o://d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee" gracePeriod=30 Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.353027 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090" gracePeriod=30 Mar 19 09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.353068 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-node" containerID="cri-o://80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c" gracePeriod=30 Mar 19 
09:19:35.353254 master-0 kubenswrapper[3979]: I0319 09:19:35.353110 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-acl-logging" containerID="cri-o://c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c" gracePeriod=30 Mar 19 09:19:35.357575 master-0 kubenswrapper[3979]: E0319 09:19:35.357332 3979 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 19 09:19:35.364600 master-0 kubenswrapper[3979]: E0319 09:19:35.364144 3979 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 19 09:19:35.364600 master-0 kubenswrapper[3979]: E0319 09:19:35.364330 3979 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 19 09:19:35.366707 master-0 kubenswrapper[3979]: E0319 09:19:35.366648 3979 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 19 09:19:35.366769 master-0 kubenswrapper[3979]: E0319 09:19:35.366738 3979 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="nbdb" Mar 19 09:19:35.367715 master-0 kubenswrapper[3979]: E0319 09:19:35.367018 3979 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 19 09:19:35.368213 master-0 kubenswrapper[3979]: E0319 09:19:35.368172 3979 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 19 09:19:35.368265 master-0 kubenswrapper[3979]: E0319 09:19:35.368210 3979 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="sbdb" Mar 19 09:19:35.380590 master-0 kubenswrapper[3979]: I0319 09:19:35.380134 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podStartSLOduration=8.345015396 podStartE2EDuration="39.380117513s" podCreationTimestamp="2026-03-19 09:18:56 +0000 UTC" firstStartedPulling="2026-03-19 09:18:57.92915503 +0000 UTC m=+112.972142608" lastFinishedPulling="2026-03-19 09:19:28.964257137 +0000 UTC m=+144.007244725" observedRunningTime="2026-03-19 09:19:35.380088783 +0000 UTC m=+150.423076371" watchObservedRunningTime="2026-03-19 09:19:35.380117513 +0000 UTC m=+150.423105081" Mar 19 09:19:35.380590 master-0 kubenswrapper[3979]: I0319 09:19:35.380373 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8kv6s" podStartSLOduration=9.418133686000001 podStartE2EDuration="52.38036918s" podCreationTimestamp="2026-03-19 09:18:43 +0000 UTC" firstStartedPulling="2026-03-19 09:18:45.053145848 +0000 UTC m=+100.096133426" lastFinishedPulling="2026-03-19 09:19:28.015381302 +0000 UTC m=+143.058368920" observedRunningTime="2026-03-19 09:19:31.898649807 +0000 UTC m=+146.941637405" watchObservedRunningTime="2026-03-19 09:19:35.38036918 +0000 UTC m=+150.423356758" Mar 19 09:19:35.382676 master-0 kubenswrapper[3979]: I0319 09:19:35.382026 3979 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" 
containerName="ovnkube-controller" containerID="cri-o://7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00" gracePeriod=30 Mar 19 09:19:35.785349 master-0 kubenswrapper[3979]: E0319 09:19:35.785277 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:19:35.950047 master-0 kubenswrapper[3979]: I0319 09:19:35.949872 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/ovnkube-controller/0.log" Mar 19 09:19:35.952399 master-0 kubenswrapper[3979]: I0319 09:19:35.952283 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/kube-rbac-proxy-ovn-metrics/0.log" Mar 19 09:19:35.953322 master-0 kubenswrapper[3979]: I0319 09:19:35.953283 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/kube-rbac-proxy-node/0.log" Mar 19 09:19:35.954340 master-0 kubenswrapper[3979]: I0319 09:19:35.954202 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/ovn-acl-logging/0.log" Mar 19 09:19:35.955216 master-0 kubenswrapper[3979]: I0319 09:19:35.955098 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/ovn-controller/0.log" Mar 19 09:19:35.956114 master-0 kubenswrapper[3979]: I0319 09:19:35.955844 3979 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.020927 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vcxjs"] Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021113 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovnkube-controller" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021133 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovnkube-controller" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021143 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="sbdb" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021151 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="sbdb" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021161 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-acl-logging" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021169 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-acl-logging" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021177 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-controller" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021185 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-controller" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021193 3979 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-node" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021201 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-node" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021210 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="northd" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021216 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="northd" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021226 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021233 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021242 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kubecfg-setup" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021249 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kubecfg-setup" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: E0319 09:19:36.021260 3979 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="nbdb" Mar 19 09:19:36.021224 master-0 kubenswrapper[3979]: I0319 09:19:36.021271 3979 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" 
containerName="nbdb" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021320 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021335 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-controller" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021345 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="kube-rbac-proxy-node" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021353 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="sbdb" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021361 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="northd" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021369 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovn-acl-logging" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021377 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="nbdb" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.021384 3979 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerName="ovnkube-controller" Mar 19 09:19:36.023329 master-0 kubenswrapper[3979]: I0319 09:19:36.022425 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.073052 master-0 kubenswrapper[3979]: I0319 09:19:36.072859 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-var-lib-openvswitch\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.073052 master-0 kubenswrapper[3979]: I0319 09:19:36.072925 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovn-node-metrics-cert\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.073052 master-0 kubenswrapper[3979]: I0319 09:19:36.072953 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-env-overrides\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.073052 master-0 kubenswrapper[3979]: I0319 09:19:36.072983 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.073052 master-0 kubenswrapper[3979]: I0319 09:19:36.073006 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-node-log\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.073052 master-0 
kubenswrapper[3979]: I0319 09:19:36.073039 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8szl\" (UniqueName: \"kubernetes.io/projected/e3ad145f-b791-4c0d-864a-8d7d6443f91a-kube-api-access-w8szl\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.073052 master-0 kubenswrapper[3979]: I0319 09:19:36.073061 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-config\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073084 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-ovn-kubernetes\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073108 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-systemd-units\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073127 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-netns\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073148 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-etc-openvswitch\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073171 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-ovn\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073190 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-log-socket\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073211 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-systemd\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073233 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-script-lib\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073255 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-kubelet\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: 
\"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073278 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-openvswitch\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073304 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-bin\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073322 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-slash\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073341 3979 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-netd\") pod \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\" (UID: \"e3ad145f-b791-4c0d-864a-8d7d6443f91a\") " Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073437 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073487 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073486 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073511 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.074274 master-0 kubenswrapper[3979]: I0319 09:19:36.073543 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073564 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073579 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073598 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073610 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073621 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073644 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073666 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073684 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073745 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-log-socket" (OuterVolumeSpecName: "log-socket") pod 
"e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073741 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073826 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073876 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073905 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-slash" (OuterVolumeSpecName: "host-slash") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073909 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075003 master-0 kubenswrapper[3979]: I0319 09:19:36.073937 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.073914 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-node-log" (OuterVolumeSpecName: "node-log") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.073992 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074008 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074067 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074097 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074118 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074160 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074185 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074163 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074200 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074209 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074209 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074322 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074359 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074414 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075470 master-0 kubenswrapper[3979]: I0319 09:19:36.074465 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074491 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvnb9\" (UniqueName: \"kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074618 3979 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074636 3979 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074647 3979 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-slash\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074673 3979 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-env-overrides\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074685 3979 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074699 3979 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074710 3979 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-node-log\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074741 3979 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074754 3979 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-systemd-units\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074763 3979 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Mar 19 
09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074773 3979 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074782 3979 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-run-netns\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074794 3979 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074804 3979 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-log-socket\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074817 3979 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074828 3979 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-host-kubelet\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.075953 master-0 kubenswrapper[3979]: I0319 09:19:36.074839 3979 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.077751 
master-0 kubenswrapper[3979]: I0319 09:19:36.077678 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:19:36.078236 master-0 kubenswrapper[3979]: I0319 09:19:36.078181 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ad145f-b791-4c0d-864a-8d7d6443f91a-kube-api-access-w8szl" (OuterVolumeSpecName: "kube-api-access-w8szl") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "kube-api-access-w8szl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:19:36.080798 master-0 kubenswrapper[3979]: I0319 09:19:36.080750 3979 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e3ad145f-b791-4c0d-864a-8d7d6443f91a" (UID: "e3ad145f-b791-4c0d-864a-8d7d6443f91a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:36.176009 master-0 kubenswrapper[3979]: I0319 09:19:36.175891 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176009 master-0 kubenswrapper[3979]: I0319 09:19:36.175981 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176009 master-0 kubenswrapper[3979]: I0319 09:19:36.176009 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176451 master-0 kubenswrapper[3979]: I0319 09:19:36.176120 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176451 master-0 kubenswrapper[3979]: I0319 09:19:36.176228 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176451 master-0 kubenswrapper[3979]: I0319 09:19:36.176286 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176451 master-0 kubenswrapper[3979]: I0319 09:19:36.176312 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176451 master-0 kubenswrapper[3979]: I0319 09:19:36.176373 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176451 master-0 kubenswrapper[3979]: I0319 09:19:36.176399 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176451 master-0 kubenswrapper[3979]: I0319 09:19:36.176444 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176737 master-0 kubenswrapper[3979]: I0319 09:19:36.176487 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176737 master-0 kubenswrapper[3979]: I0319 09:19:36.176519 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176737 master-0 kubenswrapper[3979]: I0319 09:19:36.176584 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176737 master-0 kubenswrapper[3979]: I0319 09:19:36.176611 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176737 master-0 kubenswrapper[3979]: I0319 09:19:36.176629 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: 
\"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176737 master-0 kubenswrapper[3979]: I0319 09:19:36.176698 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176737 master-0 kubenswrapper[3979]: I0319 09:19:36.176733 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176755 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176779 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176776 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" 
(UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176797 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176816 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176835 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvnb9\" (UniqueName: \"kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176849 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176874 3979 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ad145f-b791-4c0d-864a-8d7d6443f91a-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 
09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176890 3979 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8szl\" (UniqueName: \"kubernetes.io/projected/e3ad145f-b791-4c0d-864a-8d7d6443f91a-kube-api-access-w8szl\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176890 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.176950 master-0 kubenswrapper[3979]: I0319 09:19:36.176902 3979 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ad145f-b791-4c0d-864a-8d7d6443f91a-run-systemd\") on node \"master-0\" DevicePath \"\"" Mar 19 09:19:36.177370 master-0 kubenswrapper[3979]: I0319 09:19:36.177250 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.177518 master-0 kubenswrapper[3979]: I0319 09:19:36.177479 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:36.177518 master-0 kubenswrapper[3979]: I0319 09:19:36.177544 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177633 master-0 kubenswrapper[3979]: I0319 09:19:36.177566 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177633 master-0 kubenswrapper[3979]: I0319 09:19:36.177587 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177633 master-0 kubenswrapper[3979]: I0319 09:19:36.177603 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177633 master-0 kubenswrapper[3979]: I0319 09:19:36.177626 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177765 master-0 kubenswrapper[3979]: I0319 09:19:36.177649 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177765 master-0 kubenswrapper[3979]: I0319 09:19:36.177670 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177765 master-0 kubenswrapper[3979]: I0319 09:19:36.177690 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177898 master-0 kubenswrapper[3979]: I0319 09:19:36.177774 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.177898 master-0 kubenswrapper[3979]: I0319 09:19:36.177852 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.178010 master-0 kubenswrapper[3979]: I0319 09:19:36.177935 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.180518 master-0 kubenswrapper[3979]: I0319 09:19:36.180472 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.197222 master-0 kubenswrapper[3979]: I0319 09:19:36.197173 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvnb9\" (UniqueName: \"kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.337166 master-0 kubenswrapper[3979]: I0319 09:19:36.336995 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:19:36.350627 master-0 kubenswrapper[3979]: W0319 09:19:36.350582 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9eb3750_cb7b_4d3c_88bc_d1b68a370872.slice/crio-1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e WatchSource:0}: Error finding container 1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e: Status 404 returned error can't find the container with id 1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e
Mar 19 09:19:36.358814 master-0 kubenswrapper[3979]: I0319 09:19:36.358776 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/ovnkube-controller/0.log"
Mar 19 09:19:36.360366 master-0 kubenswrapper[3979]: I0319 09:19:36.360275 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/kube-rbac-proxy-ovn-metrics/0.log"
Mar 19 09:19:36.360780 master-0 kubenswrapper[3979]: I0319 09:19:36.360739 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/kube-rbac-proxy-node/0.log"
Mar 19 09:19:36.361193 master-0 kubenswrapper[3979]: I0319 09:19:36.361123 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/ovn-acl-logging/0.log"
Mar 19 09:19:36.361648 master-0 kubenswrapper[3979]: I0319 09:19:36.361621 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfj77_e3ad145f-b791-4c0d-864a-8d7d6443f91a/ovn-controller/0.log"
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.361971 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00" exitCode=1
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.362012 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" exitCode=0
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.362020 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" exitCode=0
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.362029 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee" exitCode=0
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.362036 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090" exitCode=143
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.362043 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c" exitCode=143
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.362051 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c" exitCode=143
Mar 19 09:19:36.362046 master-0 kubenswrapper[3979]: I0319 09:19:36.362058 3979 generic.go:334] "Generic (PLEG): container finished" podID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" containerID="6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17" exitCode=143
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362100 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362129 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362141 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362149 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362159 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362167 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362178 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362274 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362280 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362287 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362295 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362301 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362307 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362312 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362317 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362322 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362327 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362335 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362339 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362346 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362353 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362359 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362364 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362369 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362374 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"}
Mar 19 09:19:36.362659 master-0 kubenswrapper[3979]: I0319 09:19:36.362379 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362384 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362390 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362395 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362403 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77" event={"ID":"e3ad145f-b791-4c0d-864a-8d7d6443f91a","Type":"ContainerDied","Data":"2724f078765cc41b21ea464b50fe169d860dc07093801eacc92a75b30e3593f5"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362411 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362418 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362423 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362428 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362433 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362438 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362443 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362448 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362453 3979 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"}
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362469 3979 scope.go:117] "RemoveContainer" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"
Mar 19 09:19:36.364103 master-0 kubenswrapper[3979]: I0319 09:19:36.362609 3979 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfj77"
Mar 19 09:19:36.369075 master-0 kubenswrapper[3979]: I0319 09:19:36.369013 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e"}
Mar 19 09:19:36.400166 master-0 kubenswrapper[3979]: I0319 09:19:36.400091 3979 scope.go:117] "RemoveContainer" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"
Mar 19 09:19:36.411247 master-0 kubenswrapper[3979]: I0319 09:19:36.411151 3979 scope.go:117] "RemoveContainer" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"
Mar 19 09:19:36.421223 master-0 kubenswrapper[3979]: I0319 09:19:36.421048 3979 scope.go:117] "RemoveContainer" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"
Mar 19 09:19:36.421223 master-0 kubenswrapper[3979]: I0319 09:19:36.421260 3979 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nfj77"]
Mar 19 09:19:36.433257 master-0 kubenswrapper[3979]: I0319 09:19:36.433145 3979 scope.go:117] "RemoveContainer" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"
Mar 19 09:19:36.445162 master-0 kubenswrapper[3979]: I0319 09:19:36.441873 3979 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nfj77"]
Mar 19 09:19:36.512867 master-0 kubenswrapper[3979]: I0319 09:19:36.512619 3979 scope.go:117] "RemoveContainer" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"
Mar 19 09:19:36.525931 master-0 kubenswrapper[3979]: I0319 09:19:36.525885 3979 scope.go:117] "RemoveContainer" containerID="c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"
Mar 19 09:19:36.539259 master-0 kubenswrapper[3979]: I0319 09:19:36.538966 3979 scope.go:117] "RemoveContainer" containerID="6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"
Mar 19 09:19:36.550400 master-0 kubenswrapper[3979]: I0319 09:19:36.550357 3979 scope.go:117] "RemoveContainer" containerID="519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"
Mar 19 09:19:36.559756 master-0 kubenswrapper[3979]: I0319 09:19:36.559726 3979 scope.go:117] "RemoveContainer" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"
Mar 19 09:19:36.560170 master-0 kubenswrapper[3979]: E0319 09:19:36.560135 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": container with ID starting with 7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00 not found: ID does not exist" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"
Mar 19 09:19:36.560216 master-0 kubenswrapper[3979]: I0319 09:19:36.560168 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"} err="failed to get container status \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": rpc error: code = NotFound desc = could not find container \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": container with ID starting with 7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00 not found: ID does not exist"
Mar 19 09:19:36.560216 master-0 kubenswrapper[3979]: I0319 09:19:36.560191 3979 scope.go:117] "RemoveContainer" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"
Mar 19 09:19:36.560521 master-0 kubenswrapper[3979]: E0319 09:19:36.560480 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": container with ID starting with 20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1 not found: ID does not exist" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"
Mar 19 09:19:36.560597 master-0 kubenswrapper[3979]: I0319 09:19:36.560511 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"} err="failed to get container status \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": rpc error: code = NotFound desc = could not find container \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": container with ID starting with 20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1 not found: ID does not exist"
Mar 19 09:19:36.560597 master-0 kubenswrapper[3979]: I0319 09:19:36.560556 3979 scope.go:117] "RemoveContainer" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"
Mar 19 09:19:36.560855 master-0 kubenswrapper[3979]: E0319 09:19:36.560823 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": container with ID starting with 0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf not found: ID does not exist" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"
Mar 19 09:19:36.560855 master-0 kubenswrapper[3979]: I0319 09:19:36.560848 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"} err="failed to get container status \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": rpc error: code = NotFound desc = could not find container \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": container with ID starting with 0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf not found: ID does not exist"
Mar 19 09:19:36.560958 master-0 kubenswrapper[3979]: I0319 09:19:36.560865 3979 scope.go:117] "RemoveContainer" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"
Mar 19 09:19:36.561619 master-0 kubenswrapper[3979]: E0319 09:19:36.561491 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": container with ID starting with d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee not found: ID does not exist" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"
Mar 19 09:19:36.561619 master-0 kubenswrapper[3979]: I0319 09:19:36.561518 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"} err="failed to get container status \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": rpc error: code = NotFound desc = could not find container \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": container with ID starting with d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee not found: ID does not exist"
Mar 19 09:19:36.561619 master-0 kubenswrapper[3979]: I0319 09:19:36.561555 3979 scope.go:117] "RemoveContainer" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"
Mar 19 09:19:36.561883 master-0 kubenswrapper[3979]: E0319 09:19:36.561852 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": container with ID starting with 097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090 not found: ID does not exist" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"
Mar 19 09:19:36.561883 master-0 kubenswrapper[3979]: I0319 09:19:36.561875 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"} err="failed to get container status \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": rpc error: code = NotFound desc = could not find container \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": container with ID starting with 097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090 not found: ID does not exist"
Mar 19 09:19:36.561980 master-0 kubenswrapper[3979]: I0319 09:19:36.561893 3979 scope.go:117] "RemoveContainer" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"
Mar 19 09:19:36.562148 master-0 kubenswrapper[3979]: E0319 09:19:36.562111 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": container with ID starting with 80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c not found: ID does not exist" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"
Mar 19 09:19:36.562148 master-0 kubenswrapper[3979]: I0319 09:19:36.562140 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"} err="failed to get container status \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": rpc error: code = NotFound desc = could not find container \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": container with ID starting with 80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c not found: ID does not exist"
Mar 19 09:19:36.562238 master-0 kubenswrapper[3979]: I0319 09:19:36.562157 3979 scope.go:117] "RemoveContainer" containerID="c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"
Mar 19 09:19:36.562430 master-0 kubenswrapper[3979]: E0319 09:19:36.562401 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": container with ID starting with c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c not found: ID does not exist" containerID="c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"
Mar 19 09:19:36.562430 master-0 kubenswrapper[3979]: I0319 09:19:36.562422 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"} err="failed to get container status \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": rpc error: code = NotFound desc = could not find container \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": container with ID starting with c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c not found: ID does not exist"
Mar 19 09:19:36.562539 master-0 kubenswrapper[3979]: I0319 09:19:36.562439 3979 scope.go:117] "RemoveContainer" containerID="6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"
Mar 19 09:19:36.562871 master-0 kubenswrapper[3979]: E0319 09:19:36.562839 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": container with ID starting with 6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17 not found: ID does not exist" containerID="6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"
Mar 19 09:19:36.562939 master-0 kubenswrapper[3979]: I0319 09:19:36.562863 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"} err="failed to get container status \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": rpc error: code = NotFound desc = could not find container \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": container with ID starting with 6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17 not found: ID does not exist"
Mar 19 09:19:36.562939 master-0 kubenswrapper[3979]: I0319 09:19:36.562881 3979 scope.go:117] "RemoveContainer" containerID="519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"
Mar 19 09:19:36.563343 master-0 kubenswrapper[3979]: E0319 09:19:36.563312 3979 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": container with ID starting with 519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6 not found: ID does not exist" containerID="519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"
Mar 19 09:19:36.563399 master-0 kubenswrapper[3979]: I0319 09:19:36.563335 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"} err="failed to get container status \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": rpc error: code = NotFound desc = could not find container \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": container with ID starting with 519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6 not found: ID does not exist"
Mar 19 09:19:36.563399 master-0 kubenswrapper[3979]: I0319 09:19:36.563369 3979 scope.go:117] "RemoveContainer" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"
Mar 19 09:19:36.563644 master-0 kubenswrapper[3979]: I0319 09:19:36.563614 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"} err="failed to get container status \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": rpc error: code = NotFound desc = could not find container \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": container with ID starting with 7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00 not found: ID does not exist"
Mar 19 09:19:36.563644 master-0 kubenswrapper[3979]: I0319 09:19:36.563635 3979 scope.go:117] "RemoveContainer" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"
Mar 19 09:19:36.564002 master-0 kubenswrapper[3979]: I0319 09:19:36.563968 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"} err="failed to get container status \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": rpc error: code = NotFound desc = could not find container \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": container with ID starting with 20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1 not found: ID does not exist"
Mar 19 09:19:36.564002 master-0 kubenswrapper[3979]: I0319 09:19:36.563991 3979 scope.go:117] "RemoveContainer" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"
Mar 19 09:19:36.564257 master-0 kubenswrapper[3979]: I0319 09:19:36.564216 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"} err="failed to get container status \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": rpc error: code = NotFound desc = could not find container \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": container with ID starting with 0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf not found: ID does not exist"
Mar 19 09:19:36.564336 master-0 kubenswrapper[3979]: I0319 09:19:36.564311 3979 scope.go:117] "RemoveContainer" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"
Mar 19 09:19:36.564597 master-0 kubenswrapper[3979]: I0319 09:19:36.564569 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"} err="failed to get container status \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": rpc error: code = NotFound desc = could not find container \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": container with ID starting with d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee not found: ID does not exist"
Mar 19 09:19:36.564597 master-0 kubenswrapper[3979]: I0319 09:19:36.564589 3979 scope.go:117] "RemoveContainer" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"
Mar 19 09:19:36.564916 master-0 kubenswrapper[3979]: I0319 09:19:36.564887 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"} err="failed to get container status \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": rpc error: code = NotFound desc = could not find container \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": container with ID starting with 097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090 not found: ID does not exist"
Mar 19 09:19:36.564916 master-0 kubenswrapper[3979]: I0319 09:19:36.564906 3979 scope.go:117] "RemoveContainer" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"
Mar 19 09:19:36.565180 master-0 kubenswrapper[3979]: I0319 09:19:36.565153 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"} err="failed to get container status \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": rpc error: code = NotFound desc = could not find container \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": container with ID starting with 80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c not found: ID does not exist"
Mar 19 09:19:36.565180 master-0 kubenswrapper[3979]: I0319 09:19:36.565173 3979 scope.go:117] "RemoveContainer" containerID="c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"
Mar 19 09:19:36.565589 master-0 kubenswrapper[3979]: I0319 09:19:36.565557 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"} err="failed to get container status \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": rpc error: code = NotFound desc = could not find container \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": container with ID starting with c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c not found: ID does not exist"
Mar 19 09:19:36.565589 master-0 kubenswrapper[3979]: I0319 09:19:36.565586 3979 scope.go:117] "RemoveContainer" containerID="6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"
Mar 19 09:19:36.565833 master-0 kubenswrapper[3979]: I0319 09:19:36.565805 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"} err="failed to get container status \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": rpc error: code = NotFound desc = could not find container \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": container with ID starting with 6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17 not found: ID does not exist"
Mar 19 09:19:36.565876 master-0 kubenswrapper[3979]: I0319 09:19:36.565828 3979 scope.go:117] "RemoveContainer" containerID="519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"
Mar 19 09:19:36.566224 master-0 kubenswrapper[3979]: I0319 09:19:36.566188 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"} err="failed to get container status \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": rpc error: code = NotFound desc = could not find container \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": container with ID starting with 519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6 not found: ID does not exist"
Mar 19 09:19:36.566224 master-0 kubenswrapper[3979]: I0319 09:19:36.566214 3979 scope.go:117] "RemoveContainer" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"
Mar 19 09:19:36.566447 master-0 kubenswrapper[3979]: I0319 09:19:36.566416 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"} err="failed to get container status \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": rpc error: code = NotFound desc = could not find container \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": container with ID starting with 7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00 not found: ID does not exist"
Mar 19 09:19:36.566447 master-0
kubenswrapper[3979]: I0319 09:19:36.566437 3979 scope.go:117] "RemoveContainer" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" Mar 19 09:19:36.566738 master-0 kubenswrapper[3979]: I0319 09:19:36.566695 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"} err="failed to get container status \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": rpc error: code = NotFound desc = could not find container \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": container with ID starting with 20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1 not found: ID does not exist" Mar 19 09:19:36.566738 master-0 kubenswrapper[3979]: I0319 09:19:36.566730 3979 scope.go:117] "RemoveContainer" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" Mar 19 09:19:36.566999 master-0 kubenswrapper[3979]: I0319 09:19:36.566969 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"} err="failed to get container status \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": rpc error: code = NotFound desc = could not find container \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": container with ID starting with 0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf not found: ID does not exist" Mar 19 09:19:36.566999 master-0 kubenswrapper[3979]: I0319 09:19:36.566987 3979 scope.go:117] "RemoveContainer" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee" Mar 19 09:19:36.567219 master-0 kubenswrapper[3979]: I0319 09:19:36.567190 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"} 
err="failed to get container status \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": rpc error: code = NotFound desc = could not find container \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": container with ID starting with d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee not found: ID does not exist" Mar 19 09:19:36.567219 master-0 kubenswrapper[3979]: I0319 09:19:36.567218 3979 scope.go:117] "RemoveContainer" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090" Mar 19 09:19:36.567452 master-0 kubenswrapper[3979]: I0319 09:19:36.567420 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"} err="failed to get container status \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": rpc error: code = NotFound desc = could not find container \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": container with ID starting with 097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090 not found: ID does not exist" Mar 19 09:19:36.567510 master-0 kubenswrapper[3979]: I0319 09:19:36.567452 3979 scope.go:117] "RemoveContainer" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c" Mar 19 09:19:36.567723 master-0 kubenswrapper[3979]: I0319 09:19:36.567702 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"} err="failed to get container status \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": rpc error: code = NotFound desc = could not find container \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": container with ID starting with 80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c not found: ID does not exist" Mar 19 09:19:36.567766 master-0 
kubenswrapper[3979]: I0319 09:19:36.567722 3979 scope.go:117] "RemoveContainer" containerID="c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c" Mar 19 09:19:36.568070 master-0 kubenswrapper[3979]: I0319 09:19:36.568032 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"} err="failed to get container status \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": rpc error: code = NotFound desc = could not find container \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": container with ID starting with c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c not found: ID does not exist" Mar 19 09:19:36.568070 master-0 kubenswrapper[3979]: I0319 09:19:36.568060 3979 scope.go:117] "RemoveContainer" containerID="6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17" Mar 19 09:19:36.568303 master-0 kubenswrapper[3979]: I0319 09:19:36.568269 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"} err="failed to get container status \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": rpc error: code = NotFound desc = could not find container \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": container with ID starting with 6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17 not found: ID does not exist" Mar 19 09:19:36.568303 master-0 kubenswrapper[3979]: I0319 09:19:36.568292 3979 scope.go:117] "RemoveContainer" containerID="519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6" Mar 19 09:19:36.568582 master-0 kubenswrapper[3979]: I0319 09:19:36.568549 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"} 
err="failed to get container status \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": rpc error: code = NotFound desc = could not find container \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": container with ID starting with 519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6 not found: ID does not exist" Mar 19 09:19:36.568582 master-0 kubenswrapper[3979]: I0319 09:19:36.568575 3979 scope.go:117] "RemoveContainer" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00" Mar 19 09:19:36.568919 master-0 kubenswrapper[3979]: I0319 09:19:36.568888 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"} err="failed to get container status \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": rpc error: code = NotFound desc = could not find container \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": container with ID starting with 7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00 not found: ID does not exist" Mar 19 09:19:36.568919 master-0 kubenswrapper[3979]: I0319 09:19:36.568916 3979 scope.go:117] "RemoveContainer" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" Mar 19 09:19:36.569279 master-0 kubenswrapper[3979]: I0319 09:19:36.569254 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"} err="failed to get container status \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": rpc error: code = NotFound desc = could not find container \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": container with ID starting with 20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1 not found: ID does not exist" Mar 19 09:19:36.569279 master-0 
kubenswrapper[3979]: I0319 09:19:36.569272 3979 scope.go:117] "RemoveContainer" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" Mar 19 09:19:36.569551 master-0 kubenswrapper[3979]: I0319 09:19:36.569515 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"} err="failed to get container status \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": rpc error: code = NotFound desc = could not find container \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": container with ID starting with 0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf not found: ID does not exist" Mar 19 09:19:36.569595 master-0 kubenswrapper[3979]: I0319 09:19:36.569553 3979 scope.go:117] "RemoveContainer" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee" Mar 19 09:19:36.569803 master-0 kubenswrapper[3979]: I0319 09:19:36.569778 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"} err="failed to get container status \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": rpc error: code = NotFound desc = could not find container \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": container with ID starting with d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee not found: ID does not exist" Mar 19 09:19:36.569837 master-0 kubenswrapper[3979]: I0319 09:19:36.569809 3979 scope.go:117] "RemoveContainer" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090" Mar 19 09:19:36.570238 master-0 kubenswrapper[3979]: I0319 09:19:36.570203 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"} 
err="failed to get container status \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": rpc error: code = NotFound desc = could not find container \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": container with ID starting with 097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090 not found: ID does not exist" Mar 19 09:19:36.570238 master-0 kubenswrapper[3979]: I0319 09:19:36.570232 3979 scope.go:117] "RemoveContainer" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c" Mar 19 09:19:36.570474 master-0 kubenswrapper[3979]: I0319 09:19:36.570443 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"} err="failed to get container status \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": rpc error: code = NotFound desc = could not find container \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": container with ID starting with 80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c not found: ID does not exist" Mar 19 09:19:36.570552 master-0 kubenswrapper[3979]: I0319 09:19:36.570464 3979 scope.go:117] "RemoveContainer" containerID="c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c" Mar 19 09:19:36.570901 master-0 kubenswrapper[3979]: I0319 09:19:36.570871 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c"} err="failed to get container status \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": rpc error: code = NotFound desc = could not find container \"c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c\": container with ID starting with c90936bb0286bfb1861a19df2b6049e8a2a2fa79474dcd2cb29a3d60e500f09c not found: ID does not exist" Mar 19 09:19:36.570901 master-0 
kubenswrapper[3979]: I0319 09:19:36.570890 3979 scope.go:117] "RemoveContainer" containerID="6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17" Mar 19 09:19:36.571295 master-0 kubenswrapper[3979]: I0319 09:19:36.571233 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17"} err="failed to get container status \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": rpc error: code = NotFound desc = could not find container \"6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17\": container with ID starting with 6147bd7ead9b420cf385cb4e3bcaf6bf6892de46a6eb8e4271ca1f747e8cde17 not found: ID does not exist" Mar 19 09:19:36.571345 master-0 kubenswrapper[3979]: I0319 09:19:36.571293 3979 scope.go:117] "RemoveContainer" containerID="519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6" Mar 19 09:19:36.571762 master-0 kubenswrapper[3979]: I0319 09:19:36.571728 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6"} err="failed to get container status \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": rpc error: code = NotFound desc = could not find container \"519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6\": container with ID starting with 519e673fbc7feff2a2ab053b2509e6d1a2b2e40dd93d46fa5ab5bfe99f0c74f6 not found: ID does not exist" Mar 19 09:19:36.571762 master-0 kubenswrapper[3979]: I0319 09:19:36.571750 3979 scope.go:117] "RemoveContainer" containerID="7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00" Mar 19 09:19:36.572044 master-0 kubenswrapper[3979]: I0319 09:19:36.572000 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00"} 
err="failed to get container status \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": rpc error: code = NotFound desc = could not find container \"7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00\": container with ID starting with 7936a84ff214badfe65bf749023087cdd912c088f066e231b7eea15963512e00 not found: ID does not exist" Mar 19 09:19:36.572044 master-0 kubenswrapper[3979]: I0319 09:19:36.572031 3979 scope.go:117] "RemoveContainer" containerID="20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1" Mar 19 09:19:36.572344 master-0 kubenswrapper[3979]: I0319 09:19:36.572312 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1"} err="failed to get container status \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": rpc error: code = NotFound desc = could not find container \"20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1\": container with ID starting with 20cd2e0d3a954c4f2500e5585abd704b3c551544d217cd7046874d21fff24da1 not found: ID does not exist" Mar 19 09:19:36.572344 master-0 kubenswrapper[3979]: I0319 09:19:36.572332 3979 scope.go:117] "RemoveContainer" containerID="0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf" Mar 19 09:19:36.572754 master-0 kubenswrapper[3979]: I0319 09:19:36.572721 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf"} err="failed to get container status \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": rpc error: code = NotFound desc = could not find container \"0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf\": container with ID starting with 0e58cbebee5cd3835775c255534f469c3d7c3d100e94df08f66ea91e03573ecf not found: ID does not exist" Mar 19 09:19:36.572754 master-0 
kubenswrapper[3979]: I0319 09:19:36.572741 3979 scope.go:117] "RemoveContainer" containerID="d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee" Mar 19 09:19:36.573399 master-0 kubenswrapper[3979]: I0319 09:19:36.573341 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee"} err="failed to get container status \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": rpc error: code = NotFound desc = could not find container \"d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee\": container with ID starting with d753531b6a10abc67b51acf85ad75568de257e53b1b7e800d7628617fa746aee not found: ID does not exist" Mar 19 09:19:36.573399 master-0 kubenswrapper[3979]: I0319 09:19:36.573389 3979 scope.go:117] "RemoveContainer" containerID="097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090" Mar 19 09:19:36.573759 master-0 kubenswrapper[3979]: I0319 09:19:36.573730 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090"} err="failed to get container status \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": rpc error: code = NotFound desc = could not find container \"097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090\": container with ID starting with 097d8456699f2a42a3f3edf83cb34b0086ba1f76ce18398453ead3ab76ad8090 not found: ID does not exist" Mar 19 09:19:36.573815 master-0 kubenswrapper[3979]: I0319 09:19:36.573767 3979 scope.go:117] "RemoveContainer" containerID="80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c" Mar 19 09:19:36.574045 master-0 kubenswrapper[3979]: I0319 09:19:36.574009 3979 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c"} 
err="failed to get container status \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": rpc error: code = NotFound desc = could not find container \"80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c\": container with ID starting with 80729dcb8f44a8d81ed2fb60256a4f68f47931131ccf478d5eb9aa1839598d2c not found: ID does not exist" Mar 19 09:19:36.782743 master-0 kubenswrapper[3979]: I0319 09:19:36.782286 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:36.782743 master-0 kubenswrapper[3979]: I0319 09:19:36.782321 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:36.782743 master-0 kubenswrapper[3979]: E0319 09:19:36.782460 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:36.782743 master-0 kubenswrapper[3979]: E0319 09:19:36.782665 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:37.374852 master-0 kubenswrapper[3979]: I0319 09:19:37.374779 3979 generic.go:334] "Generic (PLEG): container finished" podID="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" containerID="0ce27311ef590bbffcd62b67c2b6ee4f6f31b7ee4bc36c74deac775d99e52498" exitCode=0 Mar 19 09:19:37.374852 master-0 kubenswrapper[3979]: I0319 09:19:37.374843 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerDied","Data":"0ce27311ef590bbffcd62b67c2b6ee4f6f31b7ee4bc36c74deac775d99e52498"} Mar 19 09:19:37.788002 master-0 kubenswrapper[3979]: I0319 09:19:37.787595 3979 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ad145f-b791-4c0d-864a-8d7d6443f91a" path="/var/lib/kubelet/pods/e3ad145f-b791-4c0d-864a-8d7d6443f91a/volumes" Mar 19 09:19:38.382518 master-0 kubenswrapper[3979]: I0319 09:19:38.382476 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"bdd16c41d28bec21c3c67ae08b67b05b98e9ed95c3ebb9f2d83f1284a787eddf"} Mar 19 09:19:38.382518 master-0 kubenswrapper[3979]: I0319 09:19:38.382517 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"0a9db4c837d83b07b82a0e4ae8fa3161a402e3f8d637a8b5527bb2a2570a2397"} Mar 19 09:19:38.383052 master-0 kubenswrapper[3979]: I0319 09:19:38.382563 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"5388bad089c8a21f529df7e99d49b7a8cc9ce837a5ca026c5985fab5c1c333f3"} Mar 19 09:19:38.383052 master-0 
kubenswrapper[3979]: I0319 09:19:38.382580 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"323b453c0f4318307c418b1b26a5e463a6ba3ba5e2354b748d33899792048aeb"} Mar 19 09:19:38.383052 master-0 kubenswrapper[3979]: I0319 09:19:38.382592 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"a0bec00b30ac7da6455a8b0edc18659b408bb0f098aa1cb613587655dc2af943"} Mar 19 09:19:38.383052 master-0 kubenswrapper[3979]: I0319 09:19:38.382601 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"e74fa45ef44ad2c8dfce9ea49447f631c091cf0f2160e5ae98af28b1dc0f6a0e"} Mar 19 09:19:38.782686 master-0 kubenswrapper[3979]: I0319 09:19:38.782639 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:38.783101 master-0 kubenswrapper[3979]: I0319 09:19:38.782776 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:38.783228 master-0 kubenswrapper[3979]: E0319 09:19:38.783046 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:38.783330 master-0 kubenswrapper[3979]: E0319 09:19:38.783276 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:40.783005 master-0 kubenswrapper[3979]: I0319 09:19:40.782901 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:40.783005 master-0 kubenswrapper[3979]: I0319 09:19:40.782901 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:40.784059 master-0 kubenswrapper[3979]: E0319 09:19:40.783133 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:40.784059 master-0 kubenswrapper[3979]: E0319 09:19:40.783308 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:40.786971 master-0 kubenswrapper[3979]: E0319 09:19:40.786908 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:19:41.397485 master-0 kubenswrapper[3979]: I0319 09:19:41.397397 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"cd737c6dc90e3ddeede98041e1d3bd6572a07823548f791a03b304ca8eeb042b"} Mar 19 09:19:42.782827 master-0 kubenswrapper[3979]: I0319 09:19:42.782414 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:42.782827 master-0 kubenswrapper[3979]: I0319 09:19:42.782429 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:42.782827 master-0 kubenswrapper[3979]: E0319 09:19:42.782643 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:42.782827 master-0 kubenswrapper[3979]: E0319 09:19:42.782697 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:43.412500 master-0 kubenswrapper[3979]: I0319 09:19:43.411938 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" event={"ID":"d9eb3750-cb7b-4d3c-88bc-d1b68a370872","Type":"ContainerStarted","Data":"f783da4cf16da7de3e504c860ca329b7d6bbc790d50292cbb30a3bfa6e5336d2"} Mar 19 09:19:43.412707 master-0 kubenswrapper[3979]: I0319 09:19:43.412513 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:43.412707 master-0 kubenswrapper[3979]: I0319 09:19:43.412546 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:43.412707 master-0 kubenswrapper[3979]: I0319 09:19:43.412558 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:43.437514 master-0 kubenswrapper[3979]: I0319 09:19:43.437467 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:43.438356 master-0 kubenswrapper[3979]: I0319 09:19:43.438333 3979 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:19:44.258235 master-0 kubenswrapper[3979]: I0319 09:19:44.258156 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" podStartSLOduration=9.258136849 podStartE2EDuration="9.258136849s" podCreationTimestamp="2026-03-19 09:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:19:44.111837276 +0000 UTC m=+159.154824884" watchObservedRunningTime="2026-03-19 
09:19:44.258136849 +0000 UTC m=+159.301124427" Mar 19 09:19:44.416135 master-0 kubenswrapper[3979]: I0319 09:19:44.416033 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/0.log" Mar 19 09:19:44.416135 master-0 kubenswrapper[3979]: I0319 09:19:44.416092 3979 generic.go:334] "Generic (PLEG): container finished" podID="157e3524-eb27-41ca-b49d-2697ee1245ca" containerID="2d3477c3a9725b873c8e5413ca72191db0e07b17ecaa8a6d3f792473fd194137" exitCode=1 Mar 19 09:19:44.416382 master-0 kubenswrapper[3979]: I0319 09:19:44.416189 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzdzd" event={"ID":"157e3524-eb27-41ca-b49d-2697ee1245ca","Type":"ContainerDied","Data":"2d3477c3a9725b873c8e5413ca72191db0e07b17ecaa8a6d3f792473fd194137"} Mar 19 09:19:44.416762 master-0 kubenswrapper[3979]: I0319 09:19:44.416734 3979 scope.go:117] "RemoveContainer" containerID="2d3477c3a9725b873c8e5413ca72191db0e07b17ecaa8a6d3f792473fd194137" Mar 19 09:19:44.770877 master-0 kubenswrapper[3979]: I0319 09:19:44.770841 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4s5vc"] Mar 19 09:19:44.770992 master-0 kubenswrapper[3979]: I0319 09:19:44.770959 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:44.771073 master-0 kubenswrapper[3979]: E0319 09:19:44.771048 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:44.782581 master-0 kubenswrapper[3979]: I0319 09:19:44.782486 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:44.782754 master-0 kubenswrapper[3979]: E0319 09:19:44.782639 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:44.838647 master-0 kubenswrapper[3979]: I0319 09:19:44.837415 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nq9vs"] Mar 19 09:19:45.424076 master-0 kubenswrapper[3979]: I0319 09:19:45.423971 3979 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/0.log" Mar 19 09:19:45.424802 master-0 kubenswrapper[3979]: I0319 09:19:45.424113 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzdzd" event={"ID":"157e3524-eb27-41ca-b49d-2697ee1245ca","Type":"ContainerStarted","Data":"4fee7c5347449c9562d0dcbc7477c8cc9cdd44af65fa32077143285ea4e0db10"} Mar 19 09:19:45.425378 master-0 kubenswrapper[3979]: I0319 09:19:45.425295 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:45.425492 master-0 kubenswrapper[3979]: E0319 09:19:45.425413 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:45.792264 master-0 kubenswrapper[3979]: E0319 09:19:45.791996 3979 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:19:45.922616 master-0 kubenswrapper[3979]: I0319 09:19:45.922573 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 09:19:46.782993 master-0 kubenswrapper[3979]: I0319 09:19:46.782913 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:46.783907 master-0 kubenswrapper[3979]: E0319 09:19:46.783034 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:47.782382 master-0 kubenswrapper[3979]: I0319 09:19:47.782286 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:47.782606 master-0 kubenswrapper[3979]: E0319 09:19:47.782490 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:48.782162 master-0 kubenswrapper[3979]: I0319 09:19:48.782062 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:48.783183 master-0 kubenswrapper[3979]: E0319 09:19:48.782216 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:48.910077 master-0 kubenswrapper[3979]: I0319 09:19:48.909988 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:48.910363 master-0 kubenswrapper[3979]: E0319 09:19:48.910204 3979 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:19:48.910363 master-0 kubenswrapper[3979]: E0319 09:19:48.910325 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:52.910298844 +0000 UTC m=+227.953286452 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:19:49.783226 master-0 kubenswrapper[3979]: I0319 09:19:49.783130 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:49.784032 master-0 kubenswrapper[3979]: E0319 09:19:49.783486 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nq9vs" podUID="13072c08-c77c-4170-9ebe-98d63968747b" Mar 19 09:19:50.782597 master-0 kubenswrapper[3979]: I0319 09:19:50.782443 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:50.783013 master-0 kubenswrapper[3979]: E0319 09:19:50.782621 3979 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4s5vc" podUID="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" Mar 19 09:19:51.129237 master-0 kubenswrapper[3979]: I0319 09:19:51.129079 3979 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Mar 19 09:19:51.782652 master-0 kubenswrapper[3979]: I0319 09:19:51.782508 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:19:51.785066 master-0 kubenswrapper[3979]: I0319 09:19:51.785014 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:19:52.782814 master-0 kubenswrapper[3979]: I0319 09:19:52.782757 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:19:52.784589 master-0 kubenswrapper[3979]: I0319 09:19:52.784505 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:19:52.784861 master-0 kubenswrapper[3979]: I0319 09:19:52.784823 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 09:19:53.081373 master-0 kubenswrapper[3979]: I0319 09:19:53.081264 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"] Mar 19 09:19:53.081654 master-0 kubenswrapper[3979]: I0319 09:19:53.081624 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:19:53.082677 master-0 kubenswrapper[3979]: I0319 09:19:53.082636 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"] Mar 19 09:19:53.083062 master-0 kubenswrapper[3979]: I0319 09:19:53.083035 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:19:53.084145 master-0 kubenswrapper[3979]: I0319 09:19:53.084121 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 09:19:53.084219 master-0 kubenswrapper[3979]: I0319 09:19:53.084198 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.085290 master-0 kubenswrapper[3979]: I0319 09:19:53.085081 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:19:53.085290 master-0 kubenswrapper[3979]: I0319 09:19:53.085215 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:19:53.092649 master-0 kubenswrapper[3979]: I0319 09:19:53.085433 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 19 09:19:53.092649 master-0 kubenswrapper[3979]: I0319 09:19:53.085669 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 09:19:53.092649 master-0 kubenswrapper[3979]: I0319 09:19:53.088771 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:19:53.097852 master-0 kubenswrapper[3979]: I0319 09:19:53.093293 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.097852 master-0 kubenswrapper[3979]: I0319 09:19:53.095214 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"] Mar 19 09:19:53.104478 
master-0 kubenswrapper[3979]: I0319 09:19:53.104377 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:19:53.105138 master-0 kubenswrapper[3979]: I0319 09:19:53.105096 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"] Mar 19 09:19:53.105897 master-0 kubenswrapper[3979]: I0319 09:19:53.105601 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:19:53.105897 master-0 kubenswrapper[3979]: I0319 09:19:53.105659 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:19:53.105897 master-0 kubenswrapper[3979]: I0319 09:19:53.105695 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"] Mar 19 09:19:53.106393 master-0 kubenswrapper[3979]: I0319 09:19:53.106322 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:19:53.107243 master-0 kubenswrapper[3979]: I0319 09:19:53.106609 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:19:53.107243 master-0 kubenswrapper[3979]: I0319 09:19:53.107109 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:19:53.109642 master-0 kubenswrapper[3979]: I0319 09:19:53.108719 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"] Mar 19 09:19:53.109642 master-0 kubenswrapper[3979]: I0319 09:19:53.109201 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:19:53.110716 master-0 kubenswrapper[3979]: I0319 09:19:53.110659 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:19:53.110880 master-0 kubenswrapper[3979]: I0319 09:19:53.110747 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898"] Mar 19 09:19:53.110999 master-0 kubenswrapper[3979]: I0319 09:19:53.110956 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 09:19:53.113193 master-0 kubenswrapper[3979]: I0319 09:19:53.112148 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 09:19:53.113193 master-0 kubenswrapper[3979]: I0319 09:19:53.112170 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:19:53.113193 master-0 kubenswrapper[3979]: I0319 09:19:53.112169 3979 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:19:53.113193 master-0 kubenswrapper[3979]: I0319 09:19:53.112224 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:19:53.113193 master-0 kubenswrapper[3979]: I0319 09:19:53.112830 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 19 09:19:53.113193 master-0 kubenswrapper[3979]: I0319 09:19:53.112882 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.113193 master-0 kubenswrapper[3979]: I0319 09:19:53.112905 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 09:19:53.113736 master-0 kubenswrapper[3979]: I0319 09:19:53.113392 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.114800 master-0 kubenswrapper[3979]: I0319 09:19:53.114738 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:19:53.114938 master-0 kubenswrapper[3979]: I0319 09:19:53.114884 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-gxznr"] Mar 19 09:19:53.115144 master-0 kubenswrapper[3979]: I0319 09:19:53.115081 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.115359 master-0 kubenswrapper[3979]: I0319 09:19:53.115276 3979 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 09:19:53.115359 master-0 kubenswrapper[3979]: I0319 09:19:53.115324 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:19:53.115840 master-0 kubenswrapper[3979]: I0319 09:19:53.115786 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 09:19:53.116088 master-0 kubenswrapper[3979]: I0319 09:19:53.116056 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:19:53.117735 master-0 kubenswrapper[3979]: I0319 09:19:53.117688 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 09:19:53.118680 master-0 kubenswrapper[3979]: I0319 09:19:53.118143 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:19:53.118680 master-0 kubenswrapper[3979]: I0319 09:19:53.118455 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:19:53.118808 master-0 kubenswrapper[3979]: I0319 09:19:53.118715 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:19:53.119443 master-0 kubenswrapper[3979]: I0319 09:19:53.119397 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"] Mar 19 09:19:53.119904 master-0 kubenswrapper[3979]: I0319 09:19:53.119873 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"] Mar 19 09:19:53.120239 master-0 kubenswrapper[3979]: I0319 09:19:53.120206 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.120348 master-0 kubenswrapper[3979]: I0319 09:19:53.120302 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:19:53.120348 master-0 kubenswrapper[3979]: I0319 09:19:53.120343 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"] Mar 19 09:19:53.122436 master-0 kubenswrapper[3979]: I0319 09:19:53.122384 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"] Mar 19 09:19:53.122665 master-0 kubenswrapper[3979]: I0319 09:19:53.122602 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.123600 master-0 kubenswrapper[3979]: I0319 09:19:53.123572 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:19:53.123600 master-0 kubenswrapper[3979]: I0319 09:19:53.123581 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:19:53.123704 master-0 kubenswrapper[3979]: I0319 09:19:53.123586 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.123818 master-0 kubenswrapper[3979]: I0319 09:19:53.123787 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 09:19:53.123871 master-0 kubenswrapper[3979]: I0319 09:19:53.123836 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 09:19:53.123912 master-0 kubenswrapper[3979]: I0319 
09:19:53.123854 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.125987 master-0 kubenswrapper[3979]: I0319 09:19:53.125935 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:19:53.126180 master-0 kubenswrapper[3979]: I0319 09:19:53.126142 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:19:53.126462 master-0 kubenswrapper[3979]: I0319 09:19:53.126423 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:19:53.126666 master-0 kubenswrapper[3979]: I0319 09:19:53.126629 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:19:53.126914 master-0 kubenswrapper[3979]: I0319 09:19:53.126869 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"] Mar 19 09:19:53.127224 master-0 kubenswrapper[3979]: I0319 09:19:53.127185 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 09:19:53.127329 master-0 kubenswrapper[3979]: I0319 09:19:53.127216 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:19:53.128122 master-0 kubenswrapper[3979]: I0319 09:19:53.128052 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"] Mar 19 09:19:53.128504 master-0 kubenswrapper[3979]: I0319 09:19:53.128453 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:19:53.128873 master-0 kubenswrapper[3979]: I0319 09:19:53.128788 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"] Mar 19 09:19:53.129296 master-0 kubenswrapper[3979]: I0319 09:19:53.129235 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"] Mar 19 09:19:53.129612 master-0 kubenswrapper[3979]: I0319 09:19:53.129548 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:19:53.129804 master-0 kubenswrapper[3979]: I0319 09:19:53.129762 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"] Mar 19 09:19:53.129969 master-0 kubenswrapper[3979]: I0319 09:19:53.129933 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:53.130099 master-0 kubenswrapper[3979]: I0319 09:19:53.130064 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:19:53.130372 master-0 kubenswrapper[3979]: I0319 09:19:53.130310 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"] Mar 19 09:19:53.130695 master-0 kubenswrapper[3979]: I0319 09:19:53.130644 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:19:53.130836 master-0 kubenswrapper[3979]: I0319 09:19:53.130721 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.134637 master-0 kubenswrapper[3979]: I0319 09:19:53.134455 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"] Mar 19 09:19:53.135010 master-0 kubenswrapper[3979]: I0319 09:19:53.134972 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:53.136320 master-0 kubenswrapper[3979]: I0319 09:19:53.136267 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:19:53.138176 master-0 kubenswrapper[3979]: I0319 09:19:53.138126 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:19:53.138303 master-0 kubenswrapper[3979]: I0319 09:19:53.138188 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.138440 master-0 kubenswrapper[3979]: I0319 09:19:53.138393 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 09:19:53.139349 master-0 kubenswrapper[3979]: I0319 09:19:53.139314 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:19:53.139349 master-0 kubenswrapper[3979]: I0319 09:19:53.139339 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"] Mar 19 09:19:53.139621 master-0 kubenswrapper[3979]: I0319 09:19:53.139377 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:19:53.139621 master-0 kubenswrapper[3979]: I0319 09:19:53.139466 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:19:53.139621 master-0 kubenswrapper[3979]: I0319 09:19:53.139467 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:19:53.139621 master-0 kubenswrapper[3979]: I0319 09:19:53.139506 3979 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.139621 master-0 kubenswrapper[3979]: I0319 09:19:53.139598 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:19:53.140022 master-0 kubenswrapper[3979]: I0319 09:19:53.139680 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:19:53.140022 master-0 kubenswrapper[3979]: I0319 09:19:53.139705 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:19:53.140022 master-0 kubenswrapper[3979]: I0319 09:19:53.139841 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:19:53.140282 master-0 kubenswrapper[3979]: I0319 09:19:53.140148 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:19:53.140599 master-0 kubenswrapper[3979]: I0319 09:19:53.140566 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:19:53.140731 master-0 kubenswrapper[3979]: I0319 09:19:53.140690 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"] Mar 19 09:19:53.140815 master-0 kubenswrapper[3979]: I0319 09:19:53.140785 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:19:53.140893 master-0 kubenswrapper[3979]: I0319 09:19:53.140848 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:19:53.140982 master-0 kubenswrapper[3979]: I0319 09:19:53.140786 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:19:53.140982 master-0 kubenswrapper[3979]: I0319 09:19:53.140943 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:19:53.141169 master-0 kubenswrapper[3979]: I0319 09:19:53.140927 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:19:53.141169 master-0 kubenswrapper[3979]: I0319 09:19:53.141005 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:19:53.141169 master-0 kubenswrapper[3979]: I0319 09:19:53.141097 3979 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"
Mar 19 09:19:53.141420 master-0 kubenswrapper[3979]: I0319 09:19:53.141183 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 09:19:53.141420 master-0 kubenswrapper[3979]: I0319 09:19:53.141222 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 09:19:53.141420 master-0 kubenswrapper[3979]: I0319 09:19:53.141242 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 09:19:53.142636 master-0 kubenswrapper[3979]: I0319 09:19:53.142502 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 09:19:53.146155 master-0 kubenswrapper[3979]: I0319 09:19:53.145565 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 09:19:53.146155 master-0 kubenswrapper[3979]: I0319 09:19:53.145582 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:19:53.146155 master-0 kubenswrapper[3979]: I0319 09:19:53.145730 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 09:19:53.146155 master-0 kubenswrapper[3979]: I0319 09:19:53.145776 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 09:19:53.146561 master-0 kubenswrapper[3979]: I0319 09:19:53.146491 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:19:53.147356 master-0 kubenswrapper[3979]: I0319 09:19:53.147182 3979 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 09:19:53.148516 master-0 kubenswrapper[3979]: I0319 09:19:53.148463 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 09:19:53.148699 master-0 kubenswrapper[3979]: I0319 09:19:53.148674 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 09:19:53.149507 master-0 kubenswrapper[3979]: I0319 09:19:53.149472 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.149631 master-0 kubenswrapper[3979]: I0319 09:19:53.149515 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:19:53.149631 master-0 kubenswrapper[3979]: I0319 09:19:53.149563 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blbc\" (UniqueName: \"kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:19:53.149631 master-0 kubenswrapper[3979]: I0319 09:19:53.149595 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rqsq\" (UniqueName: \"kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:19:53.149631 master-0 kubenswrapper[3979]: I0319 09:19:53.149625 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:19:53.149842 master-0 kubenswrapper[3979]: I0319 09:19:53.149653 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:19:53.149842 master-0 kubenswrapper[3979]: I0319 09:19:53.149727 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:19:53.149842 master-0 kubenswrapper[3979]: I0319 09:19:53.149773 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:19:53.149842 master-0 kubenswrapper[3979]: I0319 09:19:53.149799 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:53.149842 master-0 kubenswrapper[3979]: I0319 09:19:53.149837 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.149862 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.149889 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.149912 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.149951 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vcf6\" (UniqueName: \"kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.150699 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.150765 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.150803 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.150827 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hqj\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.150850 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.150879 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.156605 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.161461 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"]
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.161515 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"]
Mar 19 09:19:53.163681 master-0 kubenswrapper[3979]: I0319 09:19:53.161553 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"]
Mar 19 09:19:53.164212 master-0 kubenswrapper[3979]: I0319 09:19:53.164106 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"]
Mar 19 09:19:53.164212 master-0 kubenswrapper[3979]: I0319 09:19:53.164129 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"]
Mar 19 09:19:53.166168 master-0 kubenswrapper[3979]: I0319 09:19:53.165791 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"]
Mar 19 09:19:53.170842 master-0 kubenswrapper[3979]: I0319 09:19:53.170784 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"]
Mar 19 09:19:53.170909 master-0 kubenswrapper[3979]: I0319 09:19:53.170852 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"]
Mar 19 09:19:53.170909 master-0 kubenswrapper[3979]: I0319 09:19:53.170866 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"]
Mar 19 09:19:53.175221 master-0 kubenswrapper[3979]: I0319 09:19:53.174560 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"]
Mar 19 09:19:53.175221 master-0 kubenswrapper[3979]: I0319 09:19:53.174618 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"]
Mar 19 09:19:53.175221 master-0 kubenswrapper[3979]: I0319 09:19:53.174634 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"]
Mar 19 09:19:53.182264 master-0 kubenswrapper[3979]: I0319 09:19:53.182219 3979 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-qfc76"]
Mar 19 09:19:53.182692 master-0 kubenswrapper[3979]: I0319 09:19:53.182663 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"]
Mar 19 09:19:53.182754 master-0 kubenswrapper[3979]: I0319 09:19:53.182708 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"]
Mar 19 09:19:53.182815 master-0 kubenswrapper[3979]: I0319 09:19:53.182759 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qfc76"
Mar 19 09:19:53.184709 master-0 kubenswrapper[3979]: I0319 09:19:53.184674 3979 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.250343 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"]
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251744 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251786 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251816 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251836 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251853 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251867 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251889 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blbc\" (UniqueName: \"kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251910 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rqsq\" (UniqueName: \"kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251931 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251946 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251964 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251980 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9plst\" (UniqueName: \"kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst\") pod \"csi-snapshot-controller-operator-5f5d689c6b-jv8lm\" (UID: \"8e073eb4-67f2-4de7-8848-50da73079dbc\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.251995 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:19:53.256097 master-0 kubenswrapper[3979]: I0319 09:19:53.252013 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252036 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252058 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252082 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7t5\" (UniqueName: \"kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252121 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252145 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252167 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252193 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6tp5\" (UniqueName: \"kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252214 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252236 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252259 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hw6b\" (UniqueName: \"kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252286 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252310 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252325 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:19:53.257083 master-0 kubenswrapper[3979]: I0319 09:19:53.252341 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252356 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252373 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252389 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f5s\" (UniqueName: \"kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252407 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252424 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252440 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252458 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6dq\" (UniqueName: \"kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252474 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252491 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252509 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252546 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252572 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252594 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v88k\" (UniqueName: \"kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:19:53.257907 master-0 kubenswrapper[3979]: I0319 09:19:53.252617 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxfs\" (UniqueName: \"kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252646 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252671 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252691 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252710 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252726 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252754 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252770 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252788 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252807 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252832 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252855 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252879 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bdnt\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252909 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: 
\"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:19:53.258360 master-0 kubenswrapper[3979]: I0319 09:19:53.252926 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.252944 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.252958 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.252977 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvd6f\" (UniqueName: \"kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.252995 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcf6\" (UniqueName: \"kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.253011 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwbw\" (UniqueName: \"kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.253033 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp9jf\" (UniqueName: \"kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.253049 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.253072 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j65pb\" (UniqueName: \"kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.253107 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.253125 3979 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.258857 master-0 kubenswrapper[3979]: I0319 09:19:53.253144 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hqj\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:19:53.264784 master-0 kubenswrapper[3979]: I0319 09:19:53.264671 3979 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-gxznr"] Mar 19 09:19:53.265738 master-0 kubenswrapper[3979]: E0319 09:19:53.265696 3979 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:53.265785 master-0 kubenswrapper[3979]: I0319 09:19:53.265722 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"] Mar 19 09:19:53.265814 master-0 kubenswrapper[3979]: E0319 09:19:53.265788 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.765756667 +0000 UTC m=+168.808744255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found Mar 19 09:19:53.266034 master-0 kubenswrapper[3979]: E0319 09:19:53.265996 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:53.266621 master-0 kubenswrapper[3979]: I0319 09:19:53.266551 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:19:53.267665 master-0 kubenswrapper[3979]: I0319 09:19:53.266729 
3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:19:53.267665 master-0 kubenswrapper[3979]: E0319 09:19:53.266963 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:53.267665 master-0 kubenswrapper[3979]: E0319 09:19:53.267134 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.767107571 +0000 UTC m=+168.810095139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:53.271277 master-0 kubenswrapper[3979]: I0319 09:19:53.268315 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:19:53.271277 master-0 kubenswrapper[3979]: I0319 09:19:53.268384 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config\") pod 
\"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:19:53.271277 master-0 kubenswrapper[3979]: I0319 09:19:53.268570 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:19:53.271277 master-0 kubenswrapper[3979]: E0319 09:19:53.268741 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.768715912 +0000 UTC m=+168.811703490 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found Mar 19 09:19:53.271277 master-0 kubenswrapper[3979]: I0319 09:19:53.269828 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"] Mar 19 09:19:53.272567 master-0 kubenswrapper[3979]: I0319 09:19:53.272074 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:19:53.272567 master-0 kubenswrapper[3979]: I0319 09:19:53.272118 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:19:53.273280 master-0 kubenswrapper[3979]: I0319 09:19:53.273232 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:19:53.274898 master-0 kubenswrapper[3979]: I0319 09:19:53.273861 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:19:53.275474 master-0 kubenswrapper[3979]: I0319 09:19:53.275224 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:19:53.276216 master-0 kubenswrapper[3979]: I0319 09:19:53.276131 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898"] Mar 19 09:19:53.278302 master-0 kubenswrapper[3979]: I0319 09:19:53.278262 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"] Mar 19 09:19:53.279562 master-0 kubenswrapper[3979]: I0319 09:19:53.279492 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"] Mar 19 09:19:53.343801 master-0 kubenswrapper[3979]: I0319 09:19:53.343739 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=8.343717802 podStartE2EDuration="8.343717802s" podCreationTimestamp="2026-03-19 09:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:19:53.342400058 +0000 UTC m=+168.385387636" watchObservedRunningTime="2026-03-19 09:19:53.343717802 +0000 UTC 
m=+168.386705390" Mar 19 09:19:53.352586 master-0 kubenswrapper[3979]: I0319 09:19:53.352516 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hqj\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:19:53.353129 master-0 kubenswrapper[3979]: I0319 09:19:53.353076 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:19:53.354475 master-0 kubenswrapper[3979]: I0319 09:19:53.354433 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:19:53.354475 master-0 kubenswrapper[3979]: I0319 09:19:53.354468 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.354611 master-0 kubenswrapper[3979]: I0319 09:19:53.354490 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9plst\" 
(UniqueName: \"kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst\") pod \"csi-snapshot-controller-operator-5f5d689c6b-jv8lm\" (UID: \"8e073eb4-67f2-4de7-8848-50da73079dbc\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm" Mar 19 09:19:53.354611 master-0 kubenswrapper[3979]: I0319 09:19:53.354518 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:19:53.354611 master-0 kubenswrapper[3979]: I0319 09:19:53.354553 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:53.354611 master-0 kubenswrapper[3979]: I0319 09:19:53.354570 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.354611 master-0 kubenswrapper[3979]: I0319 09:19:53.354598 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:19:53.354796 master-0 kubenswrapper[3979]: I0319 09:19:53.354619 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:19:53.354796 master-0 kubenswrapper[3979]: I0319 09:19:53.354636 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7t5\" (UniqueName: \"kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.354864 master-0 kubenswrapper[3979]: I0319 09:19:53.354815 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.354864 master-0 kubenswrapper[3979]: I0319 09:19:53.354836 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6tp5\" (UniqueName: \"kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:19:53.354864 master-0 kubenswrapper[3979]: I0319 09:19:53.354863 3979 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:19:53.354979 master-0 kubenswrapper[3979]: I0319 09:19:53.354887 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hw6b\" (UniqueName: \"kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:19:53.355218 master-0 kubenswrapper[3979]: I0319 09:19:53.355178 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:19:53.355218 master-0 kubenswrapper[3979]: I0319 09:19:53.355211 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:19:53.355318 master-0 kubenswrapper[3979]: I0319 09:19:53.355229 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: 
\"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:53.355318 master-0 kubenswrapper[3979]: I0319 09:19:53.355255 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.355318 master-0 kubenswrapper[3979]: I0319 09:19:53.355270 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:53.355318 master-0 kubenswrapper[3979]: I0319 09:19:53.355286 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f5s\" (UniqueName: \"kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:19:53.355318 master-0 kubenswrapper[3979]: I0319 09:19:53.355300 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.355318 master-0 kubenswrapper[3979]: I0319 09:19:53.355316 3979 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vf6dq\" (UniqueName: \"kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:19:53.355640 master-0 kubenswrapper[3979]: I0319 09:19:53.355333 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:19:53.355640 master-0 kubenswrapper[3979]: E0319 09:19:53.355483 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:53.355640 master-0 kubenswrapper[3979]: E0319 09:19:53.355545 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.855506543 +0000 UTC m=+168.898494121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found Mar 19 09:19:53.355762 master-0 kubenswrapper[3979]: I0319 09:19:53.355671 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.355762 master-0 kubenswrapper[3979]: I0319 09:19:53.355695 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.355762 master-0 kubenswrapper[3979]: I0319 09:19:53.355713 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:19:53.355762 master-0 kubenswrapper[3979]: I0319 09:19:53.355734 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxfs\" (UniqueName: \"kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " 
pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:53.355762 master-0 kubenswrapper[3979]: I0319 09:19:53.355752 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v88k\" (UniqueName: \"kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355773 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355790 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355806 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355824 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355842 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355865 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355885 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355900 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script\") pod \"iptables-alerter-qfc76\" (UID: 
\"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:19:53.355951 master-0 kubenswrapper[3979]: I0319 09:19:53.355906 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.356263 master-0 kubenswrapper[3979]: I0319 09:19:53.356016 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:19:53.356263 master-0 kubenswrapper[3979]: I0319 09:19:53.356052 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bdnt\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:53.356263 master-0 kubenswrapper[3979]: I0319 09:19:53.356112 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.356263 master-0 kubenswrapper[3979]: I0319 09:19:53.356133 3979 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:19:53.356263 master-0 kubenswrapper[3979]: I0319 09:19:53.356179 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvd6f\" (UniqueName: \"kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.356263 master-0 kubenswrapper[3979]: I0319 09:19:53.356199 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwbw\" (UniqueName: \"kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.356263 master-0 kubenswrapper[3979]: I0319 09:19:53.356213 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: E0319 09:19:53.357176 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: E0319 09:19:53.357239 3979 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.857221097 +0000 UTC m=+168.900208675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: I0319 09:19:53.357702 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9jf\" (UniqueName: \"kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: I0319 09:19:53.357734 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j65pb\" (UniqueName: \"kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: I0319 09:19:53.357756 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:53.358517 master-0 
kubenswrapper[3979]: I0319 09:19:53.357771 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: E0319 09:19:53.357810 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: E0319 09:19:53.357946 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.857894004 +0000 UTC m=+168.900881662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: E0319 09:19:53.357963 3979 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: E0319 09:19:53.357997 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:53.857986346 +0000 UTC m=+168.900973924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: I0319 09:19:53.357776 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: I0319 09:19:53.358079 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: I0319 09:19:53.358146 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: I0319 09:19:53.358195 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.358517 master-0 kubenswrapper[3979]: E0319 09:19:53.358250 3979 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: E0319 09:19:53.358300 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.858284204 +0000 UTC m=+168.901271782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: I0319 09:19:53.359176 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: I0319 09:19:53.359226 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " 
pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: E0319 09:19:53.359284 3979 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: E0319 09:19:53.359292 3979 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: E0319 09:19:53.359318 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.85930612 +0000 UTC m=+168.902293698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: E0319 09:19:53.359378 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.859357132 +0000 UTC m=+168.902344940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found Mar 19 09:19:53.359477 master-0 kubenswrapper[3979]: I0319 09:19:53.359435 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:19:53.360181 master-0 kubenswrapper[3979]: I0319 09:19:53.359765 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.360181 master-0 kubenswrapper[3979]: I0319 09:19:53.360022 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.360181 master-0 kubenswrapper[3979]: I0319 09:19:53.360069 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.360365 master-0 
kubenswrapper[3979]: I0319 09:19:53.360202 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.360365 master-0 kubenswrapper[3979]: E0319 09:19:53.360272 3979 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:53.360365 master-0 kubenswrapper[3979]: E0319 09:19:53.360298 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:53.860289726 +0000 UTC m=+168.903277304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:19:53.360365 master-0 kubenswrapper[3979]: I0319 09:19:53.360314 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:19:53.360365 master-0 kubenswrapper[3979]: I0319 09:19:53.360329 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca\") pod 
\"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.361116 master-0 kubenswrapper[3979]: I0319 09:19:53.360430 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:19:53.361116 master-0 kubenswrapper[3979]: I0319 09:19:53.360564 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:19:53.361116 master-0 kubenswrapper[3979]: I0319 09:19:53.360613 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:19:53.361116 master-0 kubenswrapper[3979]: I0319 09:19:53.360728 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:53.361116 master-0 kubenswrapper[3979]: I0319 
09:19:53.361032 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.362897 master-0 kubenswrapper[3979]: I0319 09:19:53.362852 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.363286 master-0 kubenswrapper[3979]: I0319 09:19:53.363242 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.363428 master-0 kubenswrapper[3979]: I0319 09:19:53.363395 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:19:53.365103 master-0 kubenswrapper[3979]: I0319 09:19:53.365041 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 
09:19:53.400549 master-0 kubenswrapper[3979]: I0319 09:19:53.396901 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:19:53.400549 master-0 kubenswrapper[3979]: I0319 09:19:53.399761 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcf6\" (UniqueName: \"kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:19:53.408872 master-0 kubenswrapper[3979]: I0319 09:19:53.401844 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blbc\" (UniqueName: \"kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:19:53.408872 master-0 kubenswrapper[3979]: I0319 09:19:53.402924 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwbw\" (UniqueName: \"kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:19:53.408872 master-0 kubenswrapper[3979]: I0319 09:19:53.405433 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxfs\" (UniqueName: 
\"kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:53.408872 master-0 kubenswrapper[3979]: I0319 09:19:53.406472 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:19:53.409231 master-0 kubenswrapper[3979]: I0319 09:19:53.409187 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v88k\" (UniqueName: \"kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:19:53.413196 master-0 kubenswrapper[3979]: I0319 09:19:53.409568 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp9jf\" (UniqueName: \"kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:19:53.413196 master-0 kubenswrapper[3979]: I0319 09:19:53.409571 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvd6f\" (UniqueName: \"kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: 
\"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"
Mar 19 09:19:53.413196 master-0 kubenswrapper[3979]: I0319 09:19:53.411414 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7t5\" (UniqueName: \"kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:19:53.413196 master-0 kubenswrapper[3979]: I0319 09:19:53.413434 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rqsq\" (UniqueName: \"kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:19:53.413947 master-0 kubenswrapper[3979]: I0319 09:19:53.413926 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hw6b\" (UniqueName: \"kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:19:53.414032 master-0 kubenswrapper[3979]: I0319 09:19:53.413959 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:19:53.415622 master-0 kubenswrapper[3979]: I0319 09:19:53.414165 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6dq\" (UniqueName: \"kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:19:53.415622 master-0 kubenswrapper[3979]: I0319 09:19:53.414482 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:19:53.415622 master-0 kubenswrapper[3979]: I0319 09:19:53.414599 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j65pb\" (UniqueName: \"kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76"
Mar 19 09:19:53.415622 master-0 kubenswrapper[3979]: I0319 09:19:53.414674 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f5s\" (UniqueName: \"kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:19:53.415622 master-0 kubenswrapper[3979]: I0319 09:19:53.414793 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6tp5\" (UniqueName: \"kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:19:53.416085 master-0 kubenswrapper[3979]: I0319 09:19:53.415983 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:19:53.416480 master-0 kubenswrapper[3979]: I0319 09:19:53.416251 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bdnt\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:19:53.416777 master-0 kubenswrapper[3979]: I0319 09:19:53.416745 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:19:53.418409 master-0 kubenswrapper[3979]: I0319 09:19:53.418369 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9plst\" (UniqueName: \"kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst\") pod \"csi-snapshot-controller-operator-5f5d689c6b-jv8lm\" (UID: \"8e073eb4-67f2-4de7-8848-50da73079dbc\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"
Mar 19 09:19:53.424188 master-0 kubenswrapper[3979]: I0319 09:19:53.424147 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:19:53.448879 master-0 kubenswrapper[3979]: I0319 09:19:53.448811 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:19:53.471397 master-0 kubenswrapper[3979]: I0319 09:19:53.471136 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:19:53.492149 master-0 kubenswrapper[3979]: I0319 09:19:53.492096 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:19:53.502872 master-0 kubenswrapper[3979]: I0319 09:19:53.502821 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898"
Mar 19 09:19:53.515029 master-0 kubenswrapper[3979]: I0319 09:19:53.514799 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:19:53.529324 master-0 kubenswrapper[3979]: I0319 09:19:53.529265 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:19:53.537354 master-0 kubenswrapper[3979]: I0319 09:19:53.537299 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:19:53.589793 master-0 kubenswrapper[3979]: I0319 09:19:53.589748 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:19:53.613670 master-0 kubenswrapper[3979]: I0319 09:19:53.609817 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"
Mar 19 09:19:53.639758 master-0 kubenswrapper[3979]: I0319 09:19:53.639714 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"
Mar 19 09:19:53.673653 master-0 kubenswrapper[3979]: I0319 09:19:53.670119 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qfc76"
Mar 19 09:19:53.776648 master-0 kubenswrapper[3979]: I0319 09:19:53.776129 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.776789 master-0 kubenswrapper[3979]: I0319 09:19:53.776714 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:53.776789 master-0 kubenswrapper[3979]: I0319 09:19:53.776758 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:53.776954 master-0 kubenswrapper[3979]: E0319 09:19:53.776923 3979 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:19:53.777005 master-0 kubenswrapper[3979]: E0319 09:19:53.776987 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.776967855 +0000 UTC m=+169.819955433 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found
Mar 19 09:19:53.777050 master-0 kubenswrapper[3979]: E0319 09:19:53.777011 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:19:53.777119 master-0 kubenswrapper[3979]: E0319 09:19:53.777101 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.777078448 +0000 UTC m=+169.820066086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found
Mar 19 09:19:53.777317 master-0 kubenswrapper[3979]: E0319 09:19:53.777284 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:53.777363 master-0 kubenswrapper[3979]: E0319 09:19:53.777336 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.777322385 +0000 UTC m=+169.820310043 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:53.878020 master-0 kubenswrapper[3979]: I0319 09:19:53.877956 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878105 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878176 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.878158274 +0000 UTC m=+169.921145852 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: I0319 09:19:53.878227 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: I0319 09:19:53.878265 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: I0319 09:19:53.878343 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: I0319 09:19:53.878371 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: I0319 09:19:53.878400 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: I0319 09:19:53.878416 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: I0319 09:19:53.878490 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878570 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878617 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.878610026 +0000 UTC m=+169.921597614 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878760 3979 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878760 3979 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878785 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.87877552 +0000 UTC m=+169.921763098 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878818 3979 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:19:53.878842 master-0 kubenswrapper[3979]: E0319 09:19:53.878832 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.878813531 +0000 UTC m=+169.921801099 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:53.879371 master-0 kubenswrapper[3979]: E0319 09:19:53.878850 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.878842611 +0000 UTC m=+169.921830199 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found
Mar 19 09:19:53.879371 master-0 kubenswrapper[3979]: E0319 09:19:53.878900 3979 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:19:53.879371 master-0 kubenswrapper[3979]: E0319 09:19:53.878918 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.878913223 +0000 UTC m=+169.921900801 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found
Mar 19 09:19:53.879371 master-0 kubenswrapper[3979]: E0319 09:19:53.878953 3979 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:19:53.879371 master-0 kubenswrapper[3979]: E0319 09:19:53.878972 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.878965504 +0000 UTC m=+169.921953082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found
Mar 19 09:19:53.879371 master-0 kubenswrapper[3979]: E0319 09:19:53.879003 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:19:53.879371 master-0 kubenswrapper[3979]: E0319 09:19:53.879020 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:54.879015236 +0000 UTC m=+169.922002814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found
Mar 19 09:19:53.940320 master-0 kubenswrapper[3979]: I0319 09:19:53.939980 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"]
Mar 19 09:19:53.940320 master-0 kubenswrapper[3979]: I0319 09:19:53.940030 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"]
Mar 19 09:19:53.942150 master-0 kubenswrapper[3979]: I0319 09:19:53.942063 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"]
Mar 19 09:19:53.943253 master-0 kubenswrapper[3979]: I0319 09:19:53.943214 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"]
Mar 19 09:19:53.944547 master-0 kubenswrapper[3979]: I0319 09:19:53.943633 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"]
Mar 19 09:19:53.944942 master-0 kubenswrapper[3979]: I0319 09:19:53.944645 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"]
Mar 19 09:19:53.948790 master-0 kubenswrapper[3979]: I0319 09:19:53.945553 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898"]
Mar 19 09:19:53.948790 master-0 kubenswrapper[3979]: W0319 09:19:53.948681 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8527f5cd_2992_44be_90b8_e9086cedf46e.slice/crio-e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3 WatchSource:0}: Error finding container e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3: Status 404 returned error can't find the container with id e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3
Mar 19 09:19:53.969072 master-0 kubenswrapper[3979]: W0319 09:19:53.968745 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b333a1e_2a7f_423a_8b40_99f30c89f740.slice/crio-944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab WatchSource:0}: Error finding container 944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab: Status 404 returned error can't find the container with id 944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab
Mar 19 09:19:53.969493 master-0 kubenswrapper[3979]: W0319 09:19:53.969451 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03f97d1_b6fe_4fc9_8cb5_c97af7a651bb.slice/crio-0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229 WatchSource:0}: Error finding container 0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229: Status 404 returned error can't find the container with id 0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229
Mar 19 09:19:54.025290 master-0 kubenswrapper[3979]: I0319 09:19:54.024103 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"]
Mar 19 09:19:54.025290 master-0 kubenswrapper[3979]: I0319 09:19:54.024305 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"]
Mar 19 09:19:54.025290 master-0 kubenswrapper[3979]: I0319 09:19:54.025044 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"]
Mar 19 09:19:54.025418 master-0 kubenswrapper[3979]: I0319 09:19:54.025356 3979 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"]
Mar 19 09:19:54.028716 master-0 kubenswrapper[3979]: W0319 09:19:54.028680 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0c75102_6790_4ed3_84da_61c3611186f8.slice/crio-c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a WatchSource:0}: Error finding container c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a: Status 404 returned error can't find the container with id c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a
Mar 19 09:19:54.029555 master-0 kubenswrapper[3979]: W0319 09:19:54.029508 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43cb2a3b_40e2_45ee_894a_6c833ee17efd.slice/crio-a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f WatchSource:0}: Error finding container a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f: Status 404 returned error can't find the container with id a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f
Mar 19 09:19:54.032594 master-0 kubenswrapper[3979]: W0319 09:19:54.032518 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e073eb4_67f2_4de7_8848_50da73079dbc.slice/crio-0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7 WatchSource:0}: Error finding container 0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7: Status 404 returned error can't find the container with id 0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7
Mar 19 09:19:54.033715 master-0 kubenswrapper[3979]: W0319 09:19:54.033669 3979 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664acc4_ec4f_4078_ae93_404a14ea18fc.slice/crio-1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757 WatchSource:0}: Error finding container 1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757: Status 404 returned error can't find the container with id 1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757
Mar 19 09:19:54.465959 master-0 kubenswrapper[3979]: I0319 09:19:54.465786 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" event={"ID":"3b333a1e-2a7f-423a-8b40-99f30c89f740","Type":"ContainerStarted","Data":"944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab"}
Mar 19 09:19:54.467480 master-0 kubenswrapper[3979]: I0319 09:19:54.467144 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" event={"ID":"3b50118d-f7c2-4bff-aca0-5c6623819baf","Type":"ContainerStarted","Data":"1885558dee49f6f6ad4666eff4afb57c213620724cc5285f30bbd5409ae9582e"}
Mar 19 09:19:54.468008 master-0 kubenswrapper[3979]: I0319 09:19:54.467978 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" event={"ID":"f0c75102-6790-4ed3-84da-61c3611186f8","Type":"ContainerStarted","Data":"c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a"}
Mar 19 09:19:54.469206 master-0 kubenswrapper[3979]: I0319 09:19:54.469166 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm" event={"ID":"8e073eb4-67f2-4de7-8848-50da73079dbc","Type":"ContainerStarted","Data":"0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7"}
Mar 19 09:19:54.470290 master-0 kubenswrapper[3979]: I0319 09:19:54.470230 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" event={"ID":"43cb2a3b-40e2-45ee-894a-6c833ee17efd","Type":"ContainerStarted","Data":"a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f"}
Mar 19 09:19:54.471324 master-0 kubenswrapper[3979]: I0319 09:19:54.471281 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" event={"ID":"8527f5cd-2992-44be-90b8-e9086cedf46e","Type":"ContainerStarted","Data":"e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3"}
Mar 19 09:19:54.473155 master-0 kubenswrapper[3979]: I0319 09:19:54.473114 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" event={"ID":"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb","Type":"ContainerStarted","Data":"2c0b681fce22722dc6bda98cf745e2b79d2558bec9534ca23b3f5d2d7fcdef7a"}
Mar 19 09:19:54.473219 master-0 kubenswrapper[3979]: I0319 09:19:54.473158 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" event={"ID":"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb","Type":"ContainerStarted","Data":"0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229"}
Mar 19 09:19:54.474106 master-0 kubenswrapper[3979]: I0319 09:19:54.473985 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qfc76" event={"ID":"4f65184f-8fc2-4656-8776-a3b962aa1f5d","Type":"ContainerStarted","Data":"72c2a8691354c4c557c4f66cfa7db93075f810bfaffc8fba5e2d6aab857f58a8"}
Mar 19 09:19:54.475398 master-0 kubenswrapper[3979]: I0319 09:19:54.475320 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" event={"ID":"a1098584-43b9-4f2c-83d2-22d95fb7b0c3","Type":"ContainerStarted","Data":"310348963a49f41d34871a4c0d732a2191aaea2d2db0ebbe19d1390098835ced"}
Mar 19 09:19:54.476468 master-0 kubenswrapper[3979]: I0319 09:19:54.476314 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" event={"ID":"e7fae040-28fa-4d97-8482-fd0dd12cc921","Type":"ContainerStarted","Data":"963f71e764d046880085fa5f09ddf4d6f88636354e79d8ab2e64d52ec74b74ae"}
Mar 19 09:19:54.478934 master-0 kubenswrapper[3979]: I0319 09:19:54.478897 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" event={"ID":"d664acc4-ec4f-4078-ae93-404a14ea18fc","Type":"ContainerStarted","Data":"1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757"}
Mar 19 09:19:54.480136 master-0 kubenswrapper[3979]: I0319 09:19:54.480071 3979 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" event={"ID":"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4","Type":"ContainerStarted","Data":"58a6496fefda9dc10f2cd3d711f675ec3a41cf0c8719af9244e86cc4f0694683"}
Mar 19 09:19:54.786941 master-0 kubenswrapper[3979]: I0319 09:19:54.786886 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:54.787108 master-0 kubenswrapper[3979]: I0319 09:19:54.786958 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:19:54.787108 master-0 kubenswrapper[3979]: I0319 09:19:54.786978 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:19:54.787108 master-0 kubenswrapper[3979]: E0319 09:19:54.787094 3979 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:19:54.787208 master-0 kubenswrapper[3979]: E0319 09:19:54.787138 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.787124488 +0000 UTC m=+171.830112066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found
Mar 19 09:19:54.787208 master-0 kubenswrapper[3979]: E0319 09:19:54.787179 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:19:54.787208 master-0 kubenswrapper[3979]: E0319 09:19:54.787195 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.78718968 +0000 UTC m=+171.830177258 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found Mar 19 09:19:54.787361 master-0 kubenswrapper[3979]: E0319 09:19:54.787293 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:54.787417 master-0 kubenswrapper[3979]: E0319 09:19:54.787393 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.787375094 +0000 UTC m=+171.830362672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:54.889244 master-0 kubenswrapper[3979]: I0319 09:19:54.889170 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:19:54.889244 master-0 kubenswrapper[3979]: I0319 09:19:54.889246 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: 
\"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: I0319 09:19:54.889271 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889438 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: I0319 09:19:54.889484 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889508 3979 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889519 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.889500577 +0000 UTC m=+171.932488155 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889674 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889749 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.889734703 +0000 UTC m=+171.932722281 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889437 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889854 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.889822095 +0000 UTC m=+171.932809673 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: E0319 09:19:54.889878 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.889868776 +0000 UTC m=+171.932856584 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:19:54.889959 master-0 kubenswrapper[3979]: I0319 09:19:54.889905 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:54.890371 master-0 kubenswrapper[3979]: I0319 09:19:54.889962 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:19:54.890371 master-0 kubenswrapper[3979]: I0319 09:19:54.890266 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:19:54.890371 master-0 kubenswrapper[3979]: E0319 09:19:54.890036 3979 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:54.890371 master-0 kubenswrapper[3979]: I0319 09:19:54.890303 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:54.890371 master-0 kubenswrapper[3979]: E0319 09:19:54.890328 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.890317538 +0000 UTC m=+171.933305116 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:54.890371 master-0 kubenswrapper[3979]: E0319 09:19:54.890042 3979 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:54.890818 master-0 kubenswrapper[3979]: E0319 09:19:54.890411 3979 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:54.890818 master-0 kubenswrapper[3979]: E0319 09:19:54.890474 3979 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:54.890906 master-0 kubenswrapper[3979]: E0319 09:19:54.890414 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.89040504 +0000 UTC m=+171.933392618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found Mar 19 09:19:54.890958 master-0 kubenswrapper[3979]: E0319 09:19:54.890924 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:56.890910983 +0000 UTC m=+171.933898561 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found Mar 19 09:19:54.890958 master-0 kubenswrapper[3979]: E0319 09:19:54.890947 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:56.890939414 +0000 UTC m=+171.933926992 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found Mar 19 09:19:56.819086 master-0 kubenswrapper[3979]: I0319 09:19:56.819003 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:19:56.819086 master-0 kubenswrapper[3979]: I0319 09:19:56.819077 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:19:56.819924 master-0 kubenswrapper[3979]: E0319 09:19:56.819263 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:56.819924 master-0 kubenswrapper[3979]: I0319 09:19:56.819440 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:19:56.819924 master-0 kubenswrapper[3979]: E0319 09:19:56.819504 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.819485541 +0000 UTC m=+175.862473119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found Mar 19 09:19:56.819924 master-0 kubenswrapper[3979]: E0319 09:19:56.819661 3979 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:56.819924 master-0 kubenswrapper[3979]: E0319 09:19:56.819738 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:00.819717068 +0000 UTC m=+175.862704686 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found Mar 19 09:19:56.819924 master-0 kubenswrapper[3979]: E0319 09:19:56.819748 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:56.819924 master-0 kubenswrapper[3979]: E0319 09:19:56.819784 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.819773649 +0000 UTC m=+175.862761327 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:56.920842 master-0 kubenswrapper[3979]: I0319 09:19:56.920717 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:19:56.920842 master-0 kubenswrapper[3979]: I0319 09:19:56.920815 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:19:56.920842 master-0 kubenswrapper[3979]: I0319 09:19:56.920843 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:19:56.920842 master-0 kubenswrapper[3979]: E0319 09:19:56.920857 3979 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: E0319 09:19:56.920923 3979 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.920904196 +0000 UTC m=+175.963891784 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: E0319 09:19:56.920924 3979 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: I0319 09:19:56.920864 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: E0319 09:19:56.920956 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.920948267 +0000 UTC m=+175.963935855 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: E0319 09:19:56.920966 3979 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: E0319 09:19:56.920983 3979 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: E0319 09:19:56.921014 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.920997119 +0000 UTC m=+175.963984707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: E0319 09:19:56.921046 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.921026209 +0000 UTC m=+175.964013797 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: I0319 09:19:56.921100 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: I0319 09:19:56.921128 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:19:56.921315 master-0 kubenswrapper[3979]: I0319 09:19:56.921288 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:19:56.921990 master-0 kubenswrapper[3979]: E0319 09:19:56.921305 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:56.921990 master-0 kubenswrapper[3979]: E0319 09:19:56.921355 3979 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:56.921990 master-0 
kubenswrapper[3979]: E0319 09:19:56.921385 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.921375018 +0000 UTC m=+175.964362606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:19:56.921990 master-0 kubenswrapper[3979]: E0319 09:19:56.921437 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.921401909 +0000 UTC m=+175.964389527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found Mar 19 09:19:56.921990 master-0 kubenswrapper[3979]: E0319 09:19:56.921443 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:56.921990 master-0 kubenswrapper[3979]: I0319 09:19:56.921468 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:19:56.921990 
master-0 kubenswrapper[3979]: E0319 09:19:56.921502 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.921484631 +0000 UTC m=+175.964472279 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:56.921990 master-0 kubenswrapper[3979]: E0319 09:19:56.921553 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:56.921990 master-0 kubenswrapper[3979]: E0319 09:19:56.921605 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:00.921591383 +0000 UTC m=+175.964579031 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found
Mar 19 09:20:00.867354 master-0 kubenswrapper[3979]: I0319 09:20:00.867285 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:00.867354 master-0 kubenswrapper[3979]: I0319 09:20:00.867372 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:00.868099 master-0 kubenswrapper[3979]: I0319 09:20:00.867401 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:00.868099 master-0 kubenswrapper[3979]: E0319 09:20:00.867461 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:00.868099 master-0 kubenswrapper[3979]: E0319 09:20:00.867500 3979 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:20:00.868099 master-0 kubenswrapper[3979]: E0319 09:20:00.867544 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.867501843 +0000 UTC m=+183.910489421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:00.868099 master-0 kubenswrapper[3979]: E0319 09:20:00.867562 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.867553614 +0000 UTC m=+183.910541192 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found
Mar 19 09:20:00.868099 master-0 kubenswrapper[3979]: E0319 09:20:00.867610 3979 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:20:00.868099 master-0 kubenswrapper[3979]: E0319 09:20:00.867661 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.867645316 +0000 UTC m=+183.910632904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found
Mar 19 09:20:00.968467 master-0 kubenswrapper[3979]: I0319 09:20:00.968352 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:20:00.968720 master-0 kubenswrapper[3979]: E0319 09:20:00.968646 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:20:00.968799 master-0 kubenswrapper[3979]: E0319 09:20:00.968739 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.968714832 +0000 UTC m=+184.011702630 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found
Mar 19 09:20:00.968900 master-0 kubenswrapper[3979]: I0319 09:20:00.968870 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:00.969109 master-0 kubenswrapper[3979]: E0319 09:20:00.969046 3979 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:00.969165 master-0 kubenswrapper[3979]: I0319 09:20:00.969065 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:00.969165 master-0 kubenswrapper[3979]: E0319 09:20:00.969152 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.969132032 +0000 UTC m=+184.012119610 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:00.969246 master-0 kubenswrapper[3979]: I0319 09:20:00.969180 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:20:00.969246 master-0 kubenswrapper[3979]: I0319 09:20:00.969204 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:00.969326 master-0 kubenswrapper[3979]: E0319 09:20:00.969237 3979 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:20:00.969326 master-0 kubenswrapper[3979]: I0319 09:20:00.969295 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:00.969392 master-0 kubenswrapper[3979]: E0319 09:20:00.969329 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.969294446 +0000 UTC m=+184.012282184 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found
Mar 19 09:20:00.969392 master-0 kubenswrapper[3979]: E0319 09:20:00.969354 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:20:00.969392 master-0 kubenswrapper[3979]: E0319 09:20:00.969386 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.969376638 +0000 UTC m=+184.012364416 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found
Mar 19 09:20:00.969392 master-0 kubenswrapper[3979]: I0319 09:20:00.969356 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:00.969575 master-0 kubenswrapper[3979]: E0319 09:20:00.969400 3979 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:20:00.969575 master-0 kubenswrapper[3979]: I0319 09:20:00.969415 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:20:00.969575 master-0 kubenswrapper[3979]: E0319 09:20:00.969429 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.969422009 +0000 UTC m=+184.012409587 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found
Mar 19 09:20:00.969575 master-0 kubenswrapper[3979]: E0319 09:20:00.969485 3979 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:20:00.969575 master-0 kubenswrapper[3979]: E0319 09:20:00.969503 3979 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:00.969575 master-0 kubenswrapper[3979]: E0319 09:20:00.969575 3979 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:00.969791 master-0 kubenswrapper[3979]: E0319 09:20:00.969520 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.969511311 +0000 UTC m=+184.012499099 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found
Mar 19 09:20:00.969791 master-0 kubenswrapper[3979]: E0319 09:20:00.969610 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.969601244 +0000 UTC m=+184.012588812 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found
Mar 19 09:20:00.969791 master-0 kubenswrapper[3979]: E0319 09:20:00.969625 3979 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.969617594 +0000 UTC m=+184.012605172 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found
Mar 19 09:20:02.332194 master-0 kubenswrapper[3979]: I0319 09:20:02.329150 3979 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" podStartSLOduration=134.329128934 podStartE2EDuration="2m14.329128934s" podCreationTimestamp="2026-03-19 09:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:20:02.323568062 +0000 UTC m=+177.366555650" watchObservedRunningTime="2026-03-19 09:20:02.329128934 +0000 UTC m=+177.372116512"
Mar 19 09:20:03.301369 master-0 kubenswrapper[3979]: I0319 09:20:03.301048 3979 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:20:03.310467 master-0 kubenswrapper[3979]: I0319 09:20:03.310420 3979 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:20:03.595284 master-0 kubenswrapper[3979]: I0319 09:20:03.595167 3979 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:20:04.072813 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Mar 19 09:20:04.092341 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Mar 19 09:20:04.092592 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Mar 19 09:20:04.096800 master-0 systemd[1]: kubelet.service: Consumed 10.191s CPU time.
Mar 19 09:20:04.112274 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 19 09:20:04.211576 master-0 kubenswrapper[7457]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:20:04.211576 master-0 kubenswrapper[7457]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 09:20:04.211576 master-0 kubenswrapper[7457]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:20:04.211576 master-0 kubenswrapper[7457]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:20:04.211576 master-0 kubenswrapper[7457]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 09:20:04.211576 master-0 kubenswrapper[7457]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:20:04.213049 master-0 kubenswrapper[7457]: I0319 09:20:04.211650 7457 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 09:20:04.214098 master-0 kubenswrapper[7457]: W0319 09:20:04.214055 7457 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:20:04.214098 master-0 kubenswrapper[7457]: W0319 09:20:04.214082 7457 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:20:04.214098 master-0 kubenswrapper[7457]: W0319 09:20:04.214088 7457 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:20:04.214098 master-0 kubenswrapper[7457]: W0319 09:20:04.214094 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:20:04.214098 master-0 kubenswrapper[7457]: W0319 09:20:04.214100 7457 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:20:04.214098 master-0 kubenswrapper[7457]: W0319 09:20:04.214105 7457 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:20:04.214098 master-0 kubenswrapper[7457]: W0319 09:20:04.214111 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214118 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214124 7457 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214128 7457 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214133 7457 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214137 7457 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214143 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214148 7457 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214161 7457 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214167 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214171 7457 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214178 7457 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214184 7457 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214189 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214194 7457 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214199 7457 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214204 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214209 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214214 7457 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214218 7457 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:20:04.214366 master-0 kubenswrapper[7457]: W0319 09:20:04.214222 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214226 7457 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214231 7457 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214235 7457 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214239 7457 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214244 7457 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214248 7457 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214254 7457 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214258 7457 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214262 7457 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214267 7457 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214272 7457 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214276 7457 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214281 7457 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214285 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214290 7457 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214294 7457 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214298 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214302 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:20:04.215184 master-0 kubenswrapper[7457]: W0319 09:20:04.214308 7457 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214316 7457 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214320 7457 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214325 7457 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214330 7457 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214336 7457 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214341 7457 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214346 7457 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214350 7457 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214355 7457 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214361 7457 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214365 7457 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214371 7457 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214376 7457 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214382 7457 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214387 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214392 7457 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214396 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214401 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214405 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:20:04.215782 master-0 kubenswrapper[7457]: W0319 09:20:04.214411 7457 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: W0319 09:20:04.214416 7457 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: W0319 09:20:04.214420 7457 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: W0319 09:20:04.214426 7457 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: W0319 09:20:04.214431 7457 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: W0319 09:20:04.214436 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: W0319 09:20:04.214441 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214585 7457 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214597 7457 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214611 7457 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214619 7457 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214625 7457 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214631 7457 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214638 7457 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214644 7457 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214650 7457 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214655 7457 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214661 7457 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214667 7457 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214673 7457 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214678 7457 flags.go:64] FLAG: --cgroup-root=""
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214684 7457 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 09:20:04.216843 master-0 kubenswrapper[7457]: I0319 09:20:04.214689 7457 flags.go:64] FLAG: --client-ca-file=""
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214694 7457 flags.go:64] FLAG: --cloud-config=""
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214699 7457 flags.go:64] FLAG: --cloud-provider=""
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214704 7457 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214710 7457 flags.go:64] FLAG: --cluster-domain=""
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214715 7457 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214720 7457 flags.go:64] FLAG: --config-dir=""
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214725 7457 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214731 7457 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214738 7457 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214745 7457 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214750 7457 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214755 7457 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214762 7457 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214767 7457 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214771 7457 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214777 7457 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214782 7457 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214788 7457 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214794 7457 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214798 7457 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214803 7457 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214808 7457 flags.go:64] FLAG: --enable-server="true"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214812 7457 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214819 7457 flags.go:64] FLAG: --event-burst="100"
Mar 19 09:20:04.217712 master-0 kubenswrapper[7457]: I0319 09:20:04.214824 7457 flags.go:64] FLAG: --event-qps="50"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214829 7457 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214834 7457 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214839 7457 flags.go:64] FLAG: --eviction-hard=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214845 7457 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214850 7457 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214855 7457 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214860 7457 flags.go:64] FLAG: --eviction-soft=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214865 7457 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214869 7457 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214874 7457 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214879 7457 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214884 7457 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214889 7457 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214895 7457 flags.go:64] FLAG: --feature-gates=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214901 7457 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214906 7457 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214913 7457 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214919 7457 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214925 7457 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214930 7457 flags.go:64] FLAG: --help="false"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214936 7457 flags.go:64] FLAG: --hostname-override=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214941 7457 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214947 7457 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214953 7457 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 09:20:04.218545 master-0 kubenswrapper[7457]: I0319 09:20:04.214958 7457 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214963 7457 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19
09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214968 7457 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214973 7457 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214979 7457 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214984 7457 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214989 7457 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214994 7457 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.214999 7457 flags.go:64] FLAG: --kube-reserved="" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215004 7457 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215009 7457 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215014 7457 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215019 7457 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215024 7457 flags.go:64] FLAG: --lock-file="" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215028 7457 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215033 7457 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215038 7457 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 
09:20:04.215046 7457 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215050 7457 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215055 7457 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215060 7457 flags.go:64] FLAG: --logging-format="text" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215065 7457 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215070 7457 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215075 7457 flags.go:64] FLAG: --manifest-url="" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215086 7457 flags.go:64] FLAG: --manifest-url-header="" Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215092 7457 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215098 7457 flags.go:64] FLAG: --max-open-files="1000000" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215105 7457 flags.go:64] FLAG: --max-pods="110" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215110 7457 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215115 7457 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215120 7457 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215124 7457 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215130 7457 
flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215135 7457 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215140 7457 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215152 7457 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215157 7457 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215162 7457 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215167 7457 flags.go:64] FLAG: --pod-cidr="" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215172 7457 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215180 7457 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215184 7457 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215191 7457 flags.go:64] FLAG: --pods-per-core="0" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215196 7457 flags.go:64] FLAG: --port="10250" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215202 7457 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215207 7457 flags.go:64] FLAG: --provider-id="" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215213 7457 flags.go:64] FLAG: --qos-reserved="" Mar 19 
09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215219 7457 flags.go:64] FLAG: --read-only-port="10255" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215225 7457 flags.go:64] FLAG: --register-node="true" Mar 19 09:20:04.219990 master-0 kubenswrapper[7457]: I0319 09:20:04.215230 7457 flags.go:64] FLAG: --register-schedulable="true" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215236 7457 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215246 7457 flags.go:64] FLAG: --registry-burst="10" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215251 7457 flags.go:64] FLAG: --registry-qps="5" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215255 7457 flags.go:64] FLAG: --reserved-cpus="" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215259 7457 flags.go:64] FLAG: --reserved-memory="" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215265 7457 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215271 7457 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215276 7457 flags.go:64] FLAG: --rotate-certificates="false" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215281 7457 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215285 7457 flags.go:64] FLAG: --runonce="false" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215289 7457 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215293 7457 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 
09:20:04.215298 7457 flags.go:64] FLAG: --seccomp-default="false" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215302 7457 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215306 7457 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215310 7457 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215315 7457 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215319 7457 flags.go:64] FLAG: --storage-driver-password="root" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215324 7457 flags.go:64] FLAG: --storage-driver-secure="false" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215328 7457 flags.go:64] FLAG: --storage-driver-table="stats" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215332 7457 flags.go:64] FLAG: --storage-driver-user="root" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215337 7457 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215342 7457 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215346 7457 flags.go:64] FLAG: --system-cgroups="" Mar 19 09:20:04.220710 master-0 kubenswrapper[7457]: I0319 09:20:04.215350 7457 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215357 7457 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215361 7457 flags.go:64] FLAG: --tls-cert-file="" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215365 7457 
flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215370 7457 flags.go:64] FLAG: --tls-min-version="" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215374 7457 flags.go:64] FLAG: --tls-private-key-file="" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215378 7457 flags.go:64] FLAG: --topology-manager-policy="none" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215382 7457 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215387 7457 flags.go:64] FLAG: --topology-manager-scope="container" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215391 7457 flags.go:64] FLAG: --v="2" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215396 7457 flags.go:64] FLAG: --version="false" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215402 7457 flags.go:64] FLAG: --vmodule="" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215407 7457 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: I0319 09:20:04.215411 7457 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215549 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215556 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215560 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215565 7457 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
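The `flags.go:64] FLAG:` lines above record every kubelet command-line flag together with its effective value. When auditing a node it can help to pull these into a dictionary; the following is a minimal sketch (the regex, function name, and sample line are illustrative, not part of kubelet):

```python
import re

# Matches klog lines of the form:
#   I0319 09:20:04.215009 7457 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w-]+)="(.*?)"')

def parse_kubelet_flags(journal_text: str) -> dict:
    """Extract kubelet flag names and values from a journal dump."""
    return {name: value for name, value in FLAG_RE.findall(journal_text)}

sample = (
    'Mar 19 09:20:04.219260 master-0 kubenswrapper[7457]: I0319 09:20:04.215009 '
    '7457 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"'
)
flags = parse_kubelet_flags(sample)
```

Because the regex anchors on the `flags.go:` source tag, it works whether the journal has one entry per line or several entries run together.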
Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215569 7457 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215574 7457 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215579 7457 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:20:04.221565 master-0 kubenswrapper[7457]: W0319 09:20:04.215584 7457 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215589 7457 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215594 7457 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215600 7457 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215605 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215609 7457 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215614 7457 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215618 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215623 7457 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215628 7457 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215631 7457 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215635 7457 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215639 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215643 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215647 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215651 7457 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215654 7457 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215658 7457 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215661 7457 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215665 7457 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:20:04.222352 master-0 kubenswrapper[7457]: W0319 09:20:04.215669 7457 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215672 7457 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215676 7457 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215680 7457 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215684 7457 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215690 7457 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215694 7457 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215697 7457 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215701 7457 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215704 7457 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215708 7457 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215712 7457 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215715 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215719 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215722 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215726 7457 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215729 7457 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215733 7457 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215736 7457 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215740 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:20:04.223249 master-0 kubenswrapper[7457]: W0319 09:20:04.215744 7457 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215748 7457 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215752 7457 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215756 7457 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215760 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215763 7457 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215767 7457 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215770 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215774 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215778 7457 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215781 7457 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215786 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215790 7457 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215794 7457 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215799 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215803 7457 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215807 7457 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215814 7457 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215818 7457 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215822 7457 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:20:04.223999 master-0 kubenswrapper[7457]: W0319 09:20:04.215826 7457 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:20:04.224695 master-0 kubenswrapper[7457]: W0319 09:20:04.215829 7457 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:20:04.224695 master-0 kubenswrapper[7457]: W0319 09:20:04.215833 7457 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:20:04.224695 master-0 kubenswrapper[7457]: W0319 09:20:04.215837 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:20:04.224695 master-0 kubenswrapper[7457]: W0319 09:20:04.215840 7457 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:20:04.224695 master-0 kubenswrapper[7457]: I0319 09:20:04.215852 7457 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:20:04.226024 master-0 kubenswrapper[7457]: I0319 09:20:04.225949 7457 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 09:20:04.226024 master-0 kubenswrapper[7457]: I0319 09:20:04.226009 7457 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 09:20:04.226151 master-0 kubenswrapper[7457]: W0319 09:20:04.226113 7457 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:20:04.226151 master-0 kubenswrapper[7457]: W0319 09:20:04.226132 7457 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:20:04.226151 master-0 kubenswrapper[7457]: W0319 09:20:04.226141 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:20:04.226151 master-0 kubenswrapper[7457]: W0319 09:20:04.226147 7457 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:20:04.226151 master-0 kubenswrapper[7457]: W0319 09:20:04.226153 7457 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:20:04.226151 master-0 kubenswrapper[7457]: W0319 09:20:04.226160 7457 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226168 7457 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226178 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226184 7457 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226190 7457 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226196 7457 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226204 7457 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226210 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226217 7457 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226224 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226231 7457 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226238 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226245 7457 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
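The `feature_gate.go:386` entry above summarizes the effective gate settings using Go's map syntax (`{map[Name:bool ...]}`). To compare the effective gates across nodes or restarts, that line can be parsed back into a dictionary; a minimal sketch (the function name and sample line are illustrative):

```python
import re

def parse_feature_gates(line: str) -> dict:
    """Parse a klog 'feature gates: {map[Name:bool ...]}' summary into a dict of booleans."""
    m = re.search(r'feature gates: \{map\[(.*?)\]\}', line)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(':')
        gates[name] = (value == 'true')
    return gates

line = ('I0319 09:20:04.215852 7457 feature_gate.go:386] feature gates: '
        '{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}')
gates = parse_feature_gates(line)
```

Note this summary lists only the gates the kubelet itself recognizes; the names flagged by the `unrecognized feature gate` warnings never reach this map.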
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226252 7457 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226259 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226268 7457 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226275 7457 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226282 7457 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226289 7457 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226296 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:20:04.226347 master-0 kubenswrapper[7457]: W0319 09:20:04.226303 7457 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226310 7457 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226319 7457 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226332 7457 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226341 7457 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226349 7457 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226358 7457 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226367 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226375 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226384 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226392 7457 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226400 7457 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226408 7457 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226415 7457 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226421 7457 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226427 7457 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226433 7457 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226481 7457 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226487 7457 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:20:04.226954 master-0 kubenswrapper[7457]: W0319 09:20:04.226492 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226498 7457 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226503 7457 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226509 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226516 7457 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226541 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226548 7457 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226556 7457 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
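The `unrecognized feature gate` warnings repeat because the kubelet merges its gate map more than once during startup; the names appear to be OpenShift-level gates that the upstream kubelet does not know about, so each pass logs them again. To see the distinct set rather than the repeats, the warnings can be deduplicated; a minimal sketch (function name and sample text are illustrative):

```python
import re

UNRECOGNIZED_RE = re.compile(r'unrecognized feature gate: (\w+)')

def unrecognized_gates(journal_text: str) -> set:
    """Collect the distinct gate names the kubelet warned it does not recognize."""
    return set(UNRECOGNIZED_RE.findall(journal_text))

text = (
    'W0319 09:20:04.215594 7457 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission\n'
    'W0319 09:20:04.226568 7457 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission\n'
    'W0319 09:20:04.226574 7457 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration'
)
names = unrecognized_gates(text)
```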
Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226563 7457 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226568 7457 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226574 7457 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226579 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226585 7457 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226591 7457 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226597 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226603 7457 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226608 7457 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226613 7457 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226620 7457 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:20:04.227586 master-0 kubenswrapper[7457]: W0319 09:20:04.226625 7457 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226630 7457 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:20:04.228169 master-0 
kubenswrapper[7457]: W0319 09:20:04.226635 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226642 7457 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226649 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226658 7457 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226671 7457 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226678 7457 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226685 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: I0319 09:20:04.226698 7457 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226868 7457 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226879 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 
09:20:04.226885 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226892 7457 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226898 7457 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:20:04.228169 master-0 kubenswrapper[7457]: W0319 09:20:04.226903 7457 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226909 7457 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226914 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226919 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226925 7457 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226930 7457 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226935 7457 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226942 7457 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226947 7457 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226953 7457 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226961 7457 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226968 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226983 7457 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.226993 7457 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.227000 7457 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.227007 7457 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.227014 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.227020 7457 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.227026 7457 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.227033 7457 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:20:04.228680 master-0 kubenswrapper[7457]: W0319 09:20:04.227040 7457 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227047 
7457 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227054 7457 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227062 7457 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227068 7457 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227074 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227080 7457 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227086 7457 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227092 7457 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227098 7457 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227105 7457 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227114 7457 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227122 7457 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227129 7457 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227135 7457 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227143 7457 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227150 7457 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227157 7457 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227163 7457 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227169 7457 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:20:04.229336 master-0 kubenswrapper[7457]: W0319 09:20:04.227179 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227187 7457 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227193 7457 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227198 7457 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227204 7457 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 
09:20:04.227209 7457 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227215 7457 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227220 7457 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227227 7457 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227234 7457 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227239 7457 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227245 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227251 7457 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227257 7457 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227264 7457 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227271 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227277 7457 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227283 7457 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227289 7457 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:20:04.229975 master-0 kubenswrapper[7457]: W0319 09:20:04.227295 7457 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: W0319 09:20:04.227301 7457 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: W0319 09:20:04.227306 7457 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: W0319 09:20:04.227312 7457 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: W0319 09:20:04.227318 7457 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: W0319 09:20:04.227324 7457 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: W0319 09:20:04.227329 7457 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: W0319 09:20:04.227334 7457 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 
09:20:04.230569 master-0 kubenswrapper[7457]: I0319 09:20:04.227355 7457 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: I0319 09:20:04.227636 7457 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: I0319 09:20:04.229871 7457 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: I0319 09:20:04.229995 7457 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: I0319 09:20:04.230308 7457 server.go:997] "Starting client certificate rotation" Mar 19 09:20:04.230569 master-0 kubenswrapper[7457]: I0319 09:20:04.230322 7457 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 09:20:04.230941 master-0 kubenswrapper[7457]: I0319 09:20:04.230577 7457 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 09:08:48 +0000 UTC, rotation deadline is 2026-03-20 04:29:43.247954262 +0000 UTC Mar 19 09:20:04.230941 master-0 kubenswrapper[7457]: I0319 09:20:04.230648 7457 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 19h9m39.017309761s for next certificate rotation Mar 19 09:20:04.231148 master-0 kubenswrapper[7457]: I0319 09:20:04.231103 7457 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 09:20:04.232888 master-0 kubenswrapper[7457]: I0319 09:20:04.232815 7457 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 09:20:04.236790 master-0 kubenswrapper[7457]: I0319 09:20:04.236758 7457 log.go:25] "Validated CRI v1 runtime API" Mar 19 09:20:04.238959 master-0 kubenswrapper[7457]: I0319 09:20:04.238900 7457 log.go:25] "Validated CRI v1 image API" Mar 19 09:20:04.240327 master-0 kubenswrapper[7457]: I0319 09:20:04.239829 7457 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 09:20:04.247503 master-0 kubenswrapper[7457]: I0319 09:20:04.243236 7457 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 aae93335-158a-444f-870b-34679824b626:/dev/vda3] Mar 19 09:20:04.247799 master-0 kubenswrapper[7457]: I0319 09:20:04.243270 7457 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 
minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229/userdata/shm major:0 minor:249 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1885558dee49f6f6ad4666eff4afb57c213620724cc5285f30bbd5409ae9582e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1885558dee49f6f6ad4666eff4afb57c213620724cc5285f30bbd5409ae9582e/userdata/shm major:0 minor:253 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979/userdata/shm major:0 
minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/310348963a49f41d34871a4c0d732a2191aaea2d2db0ebbe19d1390098835ced/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/310348963a49f41d34871a4c0d732a2191aaea2d2db0ebbe19d1390098835ced/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/44d9515c76d2b5369510d3737c2fe1814c5099a9199ebffb839eb4e657e0735e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/44d9515c76d2b5369510d3737c2fe1814c5099a9199ebffb839eb4e657e0735e/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/58a6496fefda9dc10f2cd3d711f675ec3a41cf0c8719af9244e86cc4f0694683/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/58a6496fefda9dc10f2cd3d711f675ec3a41cf0c8719af9244e86cc4f0694683/userdata/shm major:0 minor:247 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/72c2a8691354c4c557c4f66cfa7db93075f810bfaffc8fba5e2d6aab857f58a8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/72c2a8691354c4c557c4f66cfa7db93075f810bfaffc8fba5e2d6aab857f58a8/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1/userdata/shm major:0 minor:105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/963f71e764d046880085fa5f09ddf4d6f88636354e79d8ab2e64d52ec74b74ae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/963f71e764d046880085fa5f09ddf4d6f88636354e79d8ab2e64d52ec74b74ae/userdata/shm major:0 minor:257 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174/userdata/shm major:0 minor:109 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3/userdata/shm major:0 minor:255 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:243 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/kube-api-access-6bdnt:{mountpoint:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/kube-api-access-6bdnt major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/10c609bb-136a-4ce2-b9e2-0a03e1a37a62/volumes/kubernetes.io~projected/kube-api-access-tpgbq:{mountpoint:/var/lib/kubelet/pods/10c609bb-136a-4ce2-b9e2-0a03e1a37a62/volumes/kubernetes.io~projected/kube-api-access-tpgbq major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~projected/kube-api-access-clpb5:{mountpoint:/var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~projected/kube-api-access-clpb5 major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/157e3524-eb27-41ca-b49d-2697ee1245ca/volumes/kubernetes.io~projected/kube-api-access-qhzsr:{mountpoint:/var/lib/kubelet/pods/157e3524-eb27-41ca-b49d-2697ee1245ca/volumes/kubernetes.io~projected/kube-api-access-qhzsr major:0 minor:103 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~projected/kube-api-access-9blbc:{mountpoint:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~projected/kube-api-access-9blbc major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~projected/kube-api-access-h5hk6:{mountpoint:/var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~projected/kube-api-access-h5hk6 major:0 minor:245 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~projected/kube-api-access-6v88k:{mountpoint:/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~projected/kube-api-access-6v88k major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~projected/kube-api-access-xvd6f:{mountpoint:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~projected/kube-api-access-xvd6f major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~projected/kube-api-access-6rqsq:{mountpoint:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~projected/kube-api-access-6rqsq major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~projected/kube-api-access-jtw68:{mountpoint:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~projected/kube-api-access-jtw68 major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~projected/kube-api-access-vf6dq:{mountpoint:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~projected/kube-api-access-vf6dq major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~projected/kube-api-access-2hnvh:{mountpoint:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~projected/kube-api-access-2hnvh major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4f65184f-8fc2-4656-8776-a3b962aa1f5d/volumes/kubernetes.io~projected/kube-api-access-j65pb:{mountpoint:/var/lib/kubelet/pods/4f65184f-8fc2-4656-8776-a3b962aa1f5d/volumes/kubernetes.io~projected/kube-api-access-j65pb major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~projected/kube-api-access major:0 minor:104 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~projected/kube-api-access-sfq74:{mountpoint:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~projected/kube-api-access-sfq74 major:0 minor:138 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~secret/webhook-cert major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~projected/kube-api-access-8hw6b:{mountpoint:/var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~projected/kube-api-access-8hw6b major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~projected/kube-api-access-qp9jf:{mountpoint:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~projected/kube-api-access-qp9jf major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~projected/kube-api-access-tgtgw:{mountpoint:/var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~projected/kube-api-access-tgtgw major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~projected/kube-api-access-djxfs:{mountpoint:/var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~projected/kube-api-access-djxfs major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e073eb4-67f2-4de7-8848-50da73079dbc/volumes/kubernetes.io~projected/kube-api-access-9plst:{mountpoint:/var/lib/kubelet/pods/8e073eb4-67f2-4de7-8848-50da73079dbc/volumes/kubernetes.io~projected/kube-api-access-9plst major:0 minor:246 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~projected/kube-api-access-2vcf6:{mountpoint:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~projected/kube-api-access-2vcf6 major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/979d4d12-a560-4309-a1d3-cbebe853e8ea/volumes/kubernetes.io~projected/kube-api-access-rxjqg:{mountpoint:/var/lib/kubelet/pods/979d4d12-a560-4309-a1d3-cbebe853e8ea/volumes/kubernetes.io~projected/kube-api-access-rxjqg major:0 minor:117 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~projected/kube-api-access-m6tp5:{mountpoint:/var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~projected/kube-api-access-m6tp5 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~projected/kube-api-access-vl7t5:{mountpoint:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~projected/kube-api-access-vl7t5 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/etcd-client major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:217 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/kube-api-access-v4hqj:{mountpoint:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/kube-api-access-v4hqj major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~projected/kube-api-access major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~projected/kube-api-access-lvnb9:{mountpoint:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~projected/kube-api-access-lvnb9 major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~projected/kube-api-access-w5f5s:{mountpoint:/var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~projected/kube-api-access-w5f5s major:0 minor:240 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~projected/kube-api-access major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~projected/kube-api-access-jqwbw:{mountpoint:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~projected/kube-api-access-jqwbw major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~projected/kube-api-access major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~secret/serving-cert major:0 minor:143 fsType:tmpfs blockSize:0} overlay_0-107:{mountpoint:/var/lib/containers/storage/overlay/49c77cf98e23bcc29d51f526c209e269d2e59f1002100c3a2c756f9d79a7dbd8/merged major:0 minor:107 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/ab2346c2a8ec1c0db2f0b818783d76a67e678ce7d37dd270feb7117e043ce902/merged major:0 minor:111 fsType:overlay blockSize:0} 
overlay_0-113:{mountpoint:/var/lib/containers/storage/overlay/2a6b807882ea41798341e6ea6ebb911b5664615f1f391ba8854bb2db814b8480/merged major:0 minor:113 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/fe923a01aa149db76793dc17045917d96873c5b54c995671d24b1e685c601c7b/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/dee62491a119033bb7d50d887f579ca0f06611088f66c382a8139db78fb342a0/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/0311f58b992f13129e1493ef6fb7251826fa67edfc89c277a3273a9f2d178cdd/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/6956d109f581ba0a4aa3e6aaec0cdf0d42d2f31858fc15087ad2cda594012f24/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/7998f4274a2238b57e8b2902714659188437a0dbc960443bab9161f4b65e8cc4/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/bc7f3569f50a9078712d7980f75c32e2da6a0b726696bdccc245957c1a1a632b/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/c14136521213d1eb6853566726554217f34f18787c93b6c20c72e5d75aa253f4/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/3b47c5bde3ec022e7d3889500e412e583a2393d578c856f6b86a67d8fd226ef9/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/71a0c362f9de7056a76fb8bfb3f321eb21c039b7cdbae5721f143ca93dbfcc4c/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-164:{mountpoint:/var/lib/containers/storage/overlay/e40be4c8888cabd2adee5fd441c862a9d1bb3e1da68fe2e07690d50feafc68e0/merged major:0 minor:164 fsType:overlay blockSize:0} 
overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/8720e958ef15eb5868a28c074b946011c4fae1d049f4f1ba42a1e3e747870488/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/1c91a1d2f63e80990d6a06bb2c0c1b685bda22da0a356a587f803bf87fdcf9a6/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/2f5b274c557c6a951d5978f3b4c87ef9bcd7579a6ee21cbeb34e12d31edcb5a7/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/982312cd7bee1c13f5ba7515790d39a96ed2c1716815f03376d4b1509f945a68/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/48eb8d26c67ca2a4a7f8f50b51c271eddc022981b7eb60ab3392efc2d96388f1/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/5f850bd024508006765b26f8c7d520fb2bb7ad541b2c30275e097ac1314becbe/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/d3e9581fe7ca74c6e2f6f90231a678caafcca776618b5b4fe4a5edec235b56f2/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/051a95d5a1d5c8375cb1419b880419b56ba1dd1729b44dde3605d95ba1c76397/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-271:{mountpoint:/var/lib/containers/storage/overlay/28b65bee44b8e1c6a3b254bee47d55bd95dd6bbad49e9a53f951c1f4ad2f155d/merged major:0 minor:271 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/659a2fb543fdae93daed72c971cd6a80ecd382cbc5ef7b88056d8af40569d1e0/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-275:{mountpoint:/var/lib/containers/storage/overlay/ee3ba172291ca65b21d577bef0420cf4a07656f251ee8cb3594c143e1740acc3/merged major:0 minor:275 fsType:overlay blockSize:0} 
overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/aba52ee22245adab58a7cb8da8b39a24ae4399c0efc1b34e9910b24ebf42f61e/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/7199d228cf83653f28117985b0b02fdaa27b01fda041902b16e1a069520aef8f/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/36692e533376d710b6ce6d3e9bb4e3555eb67c330c62bd146b67759e013799a6/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/267a5f646eb6b002ddbe15d0b586ee553fb7df87d84ca3689deaa8ef389b5f8d/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/b80865d8109771457ffc35c51dfc77e7d92c2c523057e06b3daff4af56d5fabd/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/2b92d539c573ddd386bf74d2dd789d56593f3985b0738074f3633426d1faea08/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/768745a4a94d5f49e975592d1c011de5ceedf48cd4986660d6917ce8d533c5d6/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/7aa19a329c9e16d2fe6e87e1a54ba48f5e65bfe432c5a57b213a3383e7f4d376/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/a0dae084494ee1f35434a45169f3811c9c66f4b3e2afc9db24b79287438fe9e5/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/a8fee7047db8240d64bbf7d5ee6342c15214c8d307aed58d1789dfe023b523fb/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/814650a8e5a66d48479431897983c3fdd5edb710dbca0978e207c78edc55b856/merged major:0 minor:43 fsType:overlay blockSize:0} 
overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/22632cf9783626db1e7d71eaacfdb1dd8e5b6feb79bc6c1a4d70b518c27601bd/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/7a15d6eea09ca87a9424954461e9220e477d01093e9149270c5829235dc5c469/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/02e1a750018547395bc48d138c08776366d6b33d72e39c7de28ec98587d8bb4f/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/e57073886d98ec5d8cbc29f2389cff0ccb907d902195081466853ab0b3d980db/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/2ba0c2cb9c02a192cb30a7e41dc61695dfa4ef4406d228b4a33dfb67cef6a0d9/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/8fcf2d53765e3b7d7e22a6d1a6041add7396124d246e47ead277591420c6e5d0/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/8df0b9fa2cae54fcccba2e03c5450bf4528e507efbf98989b713d50be13a3483/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/517759d341fc80f959115d0241fed79b340d98765aeca3a73da7bc64956016cc/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-75:{mountpoint:/var/lib/containers/storage/overlay/1b3328e9dca3701c1d42f379ab9c2d64723902f1217bd4b8338ec13055afa604/merged major:0 minor:75 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/da4a42a49cc394fe7d55f5d6203c2cf48306f6ef9f8b71e333151e2d85e4932d/merged major:0 minor:81 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/ccd70c811efe92af2724ff4a9e1d552c8c35638dfa68ccd245da63b9d73053af/merged major:0 minor:84 fsType:overlay blockSize:0} 
overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/2d4214de4f14b2494cf17c398078427f7cb3e22ef01049432466a1aa15f5ceb5/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 19 09:20:04.263558 master-0 kubenswrapper[7457]: I0319 09:20:04.263013 7457 manager.go:217] Machine: {Timestamp:2026-03-19 09:20:04.262213688 +0000 UTC m=+0.117553078 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:dab19efcf33543febdac139f3c303589 SystemUUID:dab19efc-f335-43fe-bdac-139f3c303589 BootID:870de220-908c-4452-8349-8f04a86857c3 Filesystems:[{Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~projected/kube-api-access-vl7t5 DeviceMajor:0 DeviceMinor:230 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-113 DeviceMajor:0 DeviceMinor:113 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~projected/kube-api-access-jtw68 DeviceMajor:0 DeviceMinor:125 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:242 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/157e3524-eb27-41ca-b49d-2697ee1245ca/volumes/kubernetes.io~projected/kube-api-access-qhzsr DeviceMajor:0 DeviceMinor:103 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~projected/kube-api-access-xvd6f DeviceMajor:0 DeviceMinor:234 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174/userdata/shm DeviceMajor:0 DeviceMinor:109 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:139 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/kube-api-access-v4hqj DeviceMajor:0 DeviceMinor:216 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/310348963a49f41d34871a4c0d732a2191aaea2d2db0ebbe19d1390098835ced/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs 
Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~projected/kube-api-access-djxfs DeviceMajor:0 DeviceMinor:229 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-107 DeviceMajor:0 DeviceMinor:107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:143 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert 
DeviceMajor:0 DeviceMinor:214 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~projected/kube-api-access-6rqsq DeviceMajor:0 DeviceMinor:237 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~projected/kube-api-access-2hnvh DeviceMajor:0 DeviceMinor:102 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~projected/kube-api-access-6v88k DeviceMajor:0 DeviceMinor:232 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~projected/kube-api-access-vf6dq DeviceMajor:0 DeviceMinor:235 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3/userdata/shm DeviceMajor:0 DeviceMinor:255 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:217 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-275 DeviceMajor:0 DeviceMinor:275 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:243 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/58a6496fefda9dc10f2cd3d711f675ec3a41cf0c8719af9244e86cc4f0694683/userdata/shm DeviceMajor:0 DeviceMinor:247 Capacity:67108864 
Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~projected/kube-api-access-tgtgw DeviceMajor:0 DeviceMinor:238 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/44d9515c76d2b5369510d3737c2fe1814c5099a9199ebffb839eb4e657e0735e/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:225 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~projected/kube-api-access-8hw6b DeviceMajor:0 DeviceMinor:236 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229/userdata/shm DeviceMajor:0 DeviceMinor:249 Capacity:67108864 
Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:231 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/kube-api-access-6bdnt DeviceMajor:0 DeviceMinor:244 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~projected/kube-api-access-w5f5s DeviceMajor:0 DeviceMinor:240 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-271 DeviceMajor:0 DeviceMinor:271 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1885558dee49f6f6ad4666eff4afb57c213620724cc5285f30bbd5409ae9582e/userdata/shm DeviceMajor:0 DeviceMinor:253 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1/userdata/shm DeviceMajor:0 DeviceMinor:105 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-164 DeviceMajor:0 DeviceMinor:164 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/72c2a8691354c4c557c4f66cfa7db93075f810bfaffc8fba5e2d6aab857f58a8/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:223 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~projected/kube-api-access-qp9jf DeviceMajor:0 DeviceMinor:233 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/963f71e764d046880085fa5f09ddf4d6f88636354e79d8ab2e64d52ec74b74ae/userdata/shm DeviceMajor:0 DeviceMinor:257 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~projected/kube-api-access-m6tp5 DeviceMajor:0 DeviceMinor:239 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~projected/kube-api-access-lvnb9 DeviceMajor:0 DeviceMinor:127 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~projected/kube-api-access-9blbc DeviceMajor:0 DeviceMinor:227 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/10c609bb-136a-4ce2-b9e2-0a03e1a37a62/volumes/kubernetes.io~projected/kube-api-access-tpgbq DeviceMajor:0 DeviceMinor:297 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/8e073eb4-67f2-4de7-8848-50da73079dbc/volumes/kubernetes.io~projected/kube-api-access-9plst DeviceMajor:0 DeviceMinor:246 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-75 DeviceMajor:0 DeviceMinor:75 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~projected/kube-api-access-jqwbw DeviceMajor:0 DeviceMinor:228 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~projected/kube-api-access-h5hk6 DeviceMajor:0 DeviceMinor:245 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/containers/storage/overlay-containers/205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~projected/kube-api-access-clpb5 DeviceMajor:0 DeviceMinor:123 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:104 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~projected/kube-api-access-sfq74 DeviceMajor:0 DeviceMinor:138 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/979d4d12-a560-4309-a1d3-cbebe853e8ea/volumes/kubernetes.io~projected/kube-api-access-rxjqg DeviceMajor:0 DeviceMinor:117 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~projected/kube-api-access-2vcf6 DeviceMajor:0 DeviceMinor:226 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/var/lib/kubelet/pods/4f65184f-8fc2-4656-8776-a3b962aa1f5d/volumes/kubernetes.io~projected/kube-api-access-j65pb DeviceMajor:0 DeviceMinor:241 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0865b3dd8e414cf MacAddress:0a:5d:87:16:b6:fa Speed:10000 Mtu:8900} {Name:0ee32bb670dc765 MacAddress:ae:dc:e5:43:41:14 Speed:10000 Mtu:8900} {Name:1885558dee49f6f MacAddress:3e:9b:ec:ce:7c:67 Speed:10000 Mtu:8900} {Name:1c57ea8e09d1325 MacAddress:ae:57:01:f8:2f:9d Speed:10000 Mtu:8900} {Name:310348963a49f41 MacAddress:a6:b9:02:2b:e2:b2 Speed:10000 Mtu:8900} {Name:58a6496fefda9dc MacAddress:b6:48:43:cc:af:d7 Speed:10000 Mtu:8900} {Name:944648f39111fd7 MacAddress:56:2c:ed:07:37:b9 Speed:10000 Mtu:8900} {Name:963f71e764d0468 MacAddress:7e:04:33:90:55:77 Speed:10000 Mtu:8900} {Name:a68ad4116cab887 MacAddress:e2:a1:8f:7b:4f:cc Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:32:ab:c8:1a:45:70 Speed:0 Mtu:8900} {Name:c8885dfa43b9e4c MacAddress:9a:6d:d4:70:ac:82 Speed:10000 Mtu:8900} {Name:e3a470e3bacc4ee MacAddress:d2:33:cb:b5:3c:c8 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:5a:31:1f Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:3a:39:e3:03:14:35 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 
Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 09:20:04.263558 master-0 kubenswrapper[7457]: I0319 09:20:04.263507 7457 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 19 09:20:04.263952 master-0 kubenswrapper[7457]: I0319 09:20:04.263806 7457 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 09:20:04.264076 master-0 kubenswrapper[7457]: I0319 09:20:04.264061 7457 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 09:20:04.264277 master-0 kubenswrapper[7457]: I0319 09:20:04.264230 7457 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 09:20:04.264502 master-0 kubenswrapper[7457]: I0319 09:20:04.264272 7457 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 09:20:04.264590 master-0 kubenswrapper[7457]: I0319 09:20:04.264511 7457 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 09:20:04.264590 master-0 kubenswrapper[7457]: I0319 09:20:04.264539 7457 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 09:20:04.264590 master-0 kubenswrapper[7457]: I0319 09:20:04.264551 7457 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:20:04.264590 master-0 kubenswrapper[7457]: I0319 09:20:04.264575 7457 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:20:04.264754 master-0 kubenswrapper[7457]: I0319 09:20:04.264741 7457 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:20:04.264831 master-0 kubenswrapper[7457]: I0319 09:20:04.264821 7457 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 09:20:04.264889 master-0 kubenswrapper[7457]: I0319 09:20:04.264880 7457 kubelet.go:418] "Attempting to sync node with API server" Mar 19 09:20:04.264947 master-0 kubenswrapper[7457]: I0319 09:20:04.264894 7457 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 09:20:04.264947 master-0 kubenswrapper[7457]: I0319 09:20:04.264906 7457 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 09:20:04.264947 master-0 kubenswrapper[7457]: I0319 09:20:04.264918 7457 kubelet.go:324] "Adding apiserver pod source" Mar 19 09:20:04.264947 master-0 kubenswrapper[7457]: I0319 09:20:04.264934 7457 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 09:20:04.266917 master-0 kubenswrapper[7457]: I0319 09:20:04.266879 7457 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 19 09:20:04.267119 master-0 kubenswrapper[7457]: I0319 09:20:04.267099 7457 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 19 09:20:04.267411 master-0 kubenswrapper[7457]: I0319 09:20:04.267386 7457 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 09:20:04.267561 master-0 kubenswrapper[7457]: I0319 09:20:04.267517 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267569 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267582 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267590 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267596 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267603 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267611 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267618 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 09:20:04.267623 master-0 kubenswrapper[7457]: I0319 09:20:04.267626 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 09:20:04.267870 master-0 kubenswrapper[7457]: I0319 09:20:04.267633 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 09:20:04.267870 master-0 kubenswrapper[7457]: I0319 09:20:04.267643 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 09:20:04.267870 master-0 kubenswrapper[7457]: I0319 09:20:04.267655 7457 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/local-volume" Mar 19 09:20:04.267870 master-0 kubenswrapper[7457]: I0319 09:20:04.267679 7457 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 09:20:04.268057 master-0 kubenswrapper[7457]: I0319 09:20:04.268036 7457 server.go:1280] "Started kubelet" Mar 19 09:20:04.269145 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 19 09:20:04.270968 master-0 kubenswrapper[7457]: I0319 09:20:04.270908 7457 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 09:20:04.270968 master-0 kubenswrapper[7457]: I0319 09:20:04.270951 7457 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 09:20:04.271103 master-0 kubenswrapper[7457]: I0319 09:20:04.271008 7457 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 19 09:20:04.271550 master-0 kubenswrapper[7457]: I0319 09:20:04.271500 7457 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 09:20:04.274330 master-0 kubenswrapper[7457]: I0319 09:20:04.274215 7457 server.go:449] "Adding debug handlers to kubelet server" Mar 19 09:20:04.274330 master-0 kubenswrapper[7457]: I0319 09:20:04.274240 7457 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 09:20:04.274330 master-0 kubenswrapper[7457]: I0319 09:20:04.274278 7457 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 09:20:04.275813 master-0 kubenswrapper[7457]: I0319 09:20:04.274981 7457 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 09:20:04.275813 master-0 kubenswrapper[7457]: I0319 09:20:04.275014 7457 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 09:20:04.275813 master-0 kubenswrapper[7457]: I0319 09:20:04.275067 7457 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:08:48 +0000 UTC, rotation deadline is 
2026-03-20 03:12:31.872243835 +0000 UTC Mar 19 09:20:04.275813 master-0 kubenswrapper[7457]: I0319 09:20:04.275109 7457 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h52m27.597137487s for next certificate rotation Mar 19 09:20:04.275813 master-0 kubenswrapper[7457]: E0319 09:20:04.275341 7457 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:20:04.275813 master-0 kubenswrapper[7457]: I0319 09:20:04.275404 7457 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 19 09:20:04.290322 master-0 kubenswrapper[7457]: I0319 09:20:04.289820 7457 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:20:04.290322 master-0 kubenswrapper[7457]: I0319 09:20:04.289953 7457 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:20:04.292946 master-0 kubenswrapper[7457]: I0319 09:20:04.292877 7457 factory.go:55] Registering systemd factory Mar 19 09:20:04.292946 master-0 kubenswrapper[7457]: I0319 09:20:04.292910 7457 factory.go:221] Registration of the systemd container factory successfully Mar 19 09:20:04.295029 master-0 kubenswrapper[7457]: I0319 09:20:04.294973 7457 factory.go:153] Registering CRI-O factory Mar 19 09:20:04.295029 master-0 kubenswrapper[7457]: I0319 09:20:04.295003 7457 factory.go:221] Registration of the crio container factory successfully Mar 19 09:20:04.295257 master-0 kubenswrapper[7457]: I0319 09:20:04.295243 7457 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 09:20:04.295321 master-0 kubenswrapper[7457]: I0319 09:20:04.295288 7457 factory.go:103] Registering Raw factory Mar 19 09:20:04.295321 master-0 kubenswrapper[7457]: I0319 
09:20:04.295314 7457 manager.go:1196] Started watching for new ooms in manager Mar 19 09:20:04.295545 master-0 kubenswrapper[7457]: I0319 09:20:04.295501 7457 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 09:20:04.295739 master-0 kubenswrapper[7457]: I0319 09:20:04.295719 7457 manager.go:319] Starting recovery of all containers Mar 19 09:20:04.298089 master-0 kubenswrapper[7457]: I0319 09:20:04.298000 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e073eb4-67f2-4de7-8848-50da73079dbc" volumeName="kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst" seLinuxMountContext="" Mar 19 09:20:04.298089 master-0 kubenswrapper[7457]: I0319 09:20:04.298058 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03d12dab-1215-4c1f-a9f5-27ea7174d308" volumeName="kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298097 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43cb2a3b-40e2-45ee-894a-6c833ee17efd" volumeName="kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298109 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8beda3a0-a653-4810-b3f2-d25badb21ab1" volumeName="kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298119 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="157e3524-eb27-41ca-b49d-2697ee1245ca" 
volumeName="kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298130 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8527f5cd-2992-44be-90b8-e9086cedf46e" volumeName="kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298143 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298155 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298168 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" volumeName="kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298179 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298189 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" 
volumeName="kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.298200 master-0 kubenswrapper[7457]: I0319 09:20:04.298200 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e" volumeName="kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298214 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298226 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0c75102-6790-4ed3-84da-61c3611186f8" volumeName="kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298237 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" volumeName="kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298247 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b333a1e-2a7f-423a-8b40-99f30c89f740" volumeName="kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298260 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" 
volumeName="kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298326 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d664acc4-ec4f-4078-ae93-404a14ea18fc" volumeName="kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298347 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" volumeName="kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298362 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b333a1e-2a7f-423a-8b40-99f30c89f740" volumeName="kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298373 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" volumeName="kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298382 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="157e3524-eb27-41ca-b49d-2697ee1245ca" volumeName="kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298427 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f65184f-8fc2-4656-8776-a3b962aa1f5d" 
volumeName="kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298440 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0c75102-6790-4ed3-84da-61c3611186f8" volumeName="kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298450 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298461 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298474 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298487 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0c75102-6790-4ed3-84da-61c3611186f8" volumeName="kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298539 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4abcf2ea-50f5-4d62-8a23-583438e5b451" 
volumeName="kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298555 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298566 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c247d991-809e-46b6-9617-9b05007b7560" volumeName="kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298576 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51b88818-5108-40db-90c8-4f2e7198959e" volumeName="kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298587 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" volumeName="kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298601 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298612 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e" 
volumeName="kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298624 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b50118d-f7c2-4bff-aca0-5c6623819baf" volumeName="kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298636 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f65184f-8fc2-4656-8776-a3b962aa1f5d" volumeName="kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298647 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c8ee765-76b8-4cde-8acb-6e5edd1b8149" volumeName="kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config" seLinuxMountContext="" Mar 19 09:20:04.298638 master-0 kubenswrapper[7457]: I0319 09:20:04.298660 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16d2930b-486b-492d-983e-c6702d8f53a7" volumeName="kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298673 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43cb2a3b-40e2-45ee-894a-6c833ee17efd" volumeName="kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298685 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" volumeName="kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298698 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="13072c08-c77c-4170-9ebe-98d63968747b" volumeName="kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298710 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="157e3524-eb27-41ca-b49d-2697ee1245ca" volumeName="kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298722 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" volumeName="kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298733 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" volumeName="kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298745 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298756 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298768 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298781 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03d12dab-1215-4c1f-a9f5-27ea7174d308" volumeName="kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298794 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298806 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298826 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298845 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c247d991-809e-46b6-9617-9b05007b7560" volumeName="kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298860 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" volumeName="kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298873 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" volumeName="kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298886 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="259794ab-d027-497a-b08e-5a6d79057668" volumeName="kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298901 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9a6c1523-e77c-4aac-814c-05d41215c42f" volumeName="kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298942 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298956 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="51b88818-5108-40db-90c8-4f2e7198959e" volumeName="kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298968 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b29cb7b-26d2-4fab-9e03-2d7fdf937592" volumeName="kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298980 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.298992 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d664acc4-ec4f-4078-ae93-404a14ea18fc" volumeName="kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299006 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03d12dab-1215-4c1f-a9f5-27ea7174d308" volumeName="kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299017 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b333a1e-2a7f-423a-8b40-99f30c89f740" volumeName="kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299029 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3b50118d-f7c2-4bff-aca0-5c6623819baf" volumeName="kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299042 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299053 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299065 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d664acc4-ec4f-4078-ae93-404a14ea18fc" volumeName="kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299076 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c8ee765-76b8-4cde-8acb-6e5edd1b8149" volumeName="kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299090 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9076d131-644a-4332-8a70-34f6b0f71575" volumeName="kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299101 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299113 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b50118d-f7c2-4bff-aca0-5c6623819baf" volumeName="kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299123 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4abcf2ea-50f5-4d62-8a23-583438e5b451" volumeName="kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299132 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8527f5cd-2992-44be-90b8-e9086cedf46e" volumeName="kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299141 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9076d131-644a-4332-8a70-34f6b0f71575" volumeName="kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299149 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299158 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c247d991-809e-46b6-9617-9b05007b7560" volumeName="kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299166 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299175 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299186 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43cb2a3b-40e2-45ee-894a-6c833ee17efd" volumeName="kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299194 7457 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8527f5cd-2992-44be-90b8-e9086cedf46e" volumeName="kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert" seLinuxMountContext="" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299204 7457 reconstruct.go:97] "Volume reconstruction finished" Mar 19 09:20:04.299395 master-0 kubenswrapper[7457]: I0319 09:20:04.299211 7457 reconciler.go:26] "Reconciler: start to sync state" Mar 19 09:20:04.330830 master-0 kubenswrapper[7457]: I0319 09:20:04.330636 7457 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 19 09:20:04.333971 master-0 kubenswrapper[7457]: I0319 09:20:04.333931 7457 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 19 09:20:04.334078 master-0 kubenswrapper[7457]: I0319 09:20:04.333986 7457 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 09:20:04.334078 master-0 kubenswrapper[7457]: I0319 09:20:04.334013 7457 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 09:20:04.334078 master-0 kubenswrapper[7457]: E0319 09:20:04.334062 7457 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 09:20:04.335783 master-0 kubenswrapper[7457]: I0319 09:20:04.335750 7457 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:20:04.340569 master-0 kubenswrapper[7457]: I0319 09:20:04.340498 7457 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="382712d4a8a720b54161d083c15e892932ef38c413a22bb647480e2f84ff33a9" exitCode=1 Mar 19 09:20:04.344820 master-0 kubenswrapper[7457]: I0319 09:20:04.344768 7457 generic.go:334] "Generic (PLEG): container finished" podID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerID="1e814e1f8603ada52f29b21f78df17d1b4dc0c1bc66fb422a5b77d8e27ae2d59" exitCode=0 Mar 19 09:20:04.348491 master-0 kubenswrapper[7457]: I0319 09:20:04.348449 7457 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="1de935d5d79686ee37ae77f43c7f709d103c6ab561712f1da495ac19ccceba4b" exitCode=0 Mar 19 09:20:04.348491 master-0 kubenswrapper[7457]: I0319 09:20:04.348477 7457 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="3d15d0fa4a3f8c9035c8ce9b72d3cf571d79c5e3676c413632c3d1ba3c37a426" exitCode=0 Mar 19 09:20:04.348491 master-0 kubenswrapper[7457]: 
I0319 09:20:04.348483 7457 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="2dce09604f673a98b5b76aa5ab393a537cdfc70dd6be1c99472f960c60ad55b9" exitCode=0 Mar 19 09:20:04.348491 master-0 kubenswrapper[7457]: I0319 09:20:04.348490 7457 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="886f43f428dc7d770e78699ea2b9793dc0fcaa7dc9eeaeafd637bd2727c22201" exitCode=0 Mar 19 09:20:04.348491 master-0 kubenswrapper[7457]: I0319 09:20:04.348497 7457 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="e715d0ff200bfc6a3198a0daa26814bad61e6acd8631c88afff9d4a08fe673ba" exitCode=0 Mar 19 09:20:04.348661 master-0 kubenswrapper[7457]: I0319 09:20:04.348504 7457 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="11e09cac68fe5f9a91247cf89d443e062789ce0301fe0e6f213f48df912e0870" exitCode=0 Mar 19 09:20:04.357040 master-0 kubenswrapper[7457]: I0319 09:20:04.356970 7457 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8" exitCode=0 Mar 19 09:20:04.361355 master-0 kubenswrapper[7457]: I0319 09:20:04.361296 7457 generic.go:334] "Generic (PLEG): container finished" podID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerID="d486a2c521f4c2c3eb232b1929f8a1ec255878f2382227f7f128e10063843ecc" exitCode=0 Mar 19 09:20:04.370725 master-0 kubenswrapper[7457]: I0319 09:20:04.370686 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/0.log" Mar 19 09:20:04.370725 master-0 kubenswrapper[7457]: I0319 09:20:04.370718 7457 generic.go:334] "Generic (PLEG): container finished" podID="157e3524-eb27-41ca-b49d-2697ee1245ca" 
containerID="2d3477c3a9725b873c8e5413ca72191db0e07b17ecaa8a6d3f792473fd194137" exitCode=1 Mar 19 09:20:04.374948 master-0 kubenswrapper[7457]: I0319 09:20:04.374889 7457 generic.go:334] "Generic (PLEG): container finished" podID="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" containerID="0ce27311ef590bbffcd62b67c2b6ee4f6f31b7ee4bc36c74deac775d99e52498" exitCode=0 Mar 19 09:20:04.377876 master-0 kubenswrapper[7457]: I0319 09:20:04.377834 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:20:04.378269 master-0 kubenswrapper[7457]: I0319 09:20:04.378197 7457 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b" exitCode=1 Mar 19 09:20:04.378423 master-0 kubenswrapper[7457]: I0319 09:20:04.378269 7457 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="e5e1897ddbf62a1e1975ee8d4b56ad3a8cd0b0cf3d4e0758eac825b5a75e9b66" exitCode=0 Mar 19 09:20:04.410895 master-0 kubenswrapper[7457]: I0319 09:20:04.410849 7457 manager.go:324] Recovery completed Mar 19 09:20:04.434424 master-0 kubenswrapper[7457]: E0319 09:20:04.434369 7457 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 09:20:04.439853 master-0 kubenswrapper[7457]: I0319 09:20:04.439823 7457 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 09:20:04.439853 master-0 kubenswrapper[7457]: I0319 09:20:04.439843 7457 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 09:20:04.439950 master-0 kubenswrapper[7457]: I0319 09:20:04.439862 7457 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:20:04.440114 master-0 kubenswrapper[7457]: I0319 09:20:04.440088 7457 state_mem.go:88] "Updated default CPUSet" cpuSet="" 
Mar 19 09:20:04.440151 master-0 kubenswrapper[7457]: I0319 09:20:04.440110 7457 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 19 09:20:04.440151 master-0 kubenswrapper[7457]: I0319 09:20:04.440137 7457 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 19 09:20:04.440151 master-0 kubenswrapper[7457]: I0319 09:20:04.440144 7457 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 19 09:20:04.440222 master-0 kubenswrapper[7457]: I0319 09:20:04.440152 7457 policy_none.go:49] "None policy: Start" Mar 19 09:20:04.441598 master-0 kubenswrapper[7457]: I0319 09:20:04.441553 7457 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 09:20:04.441598 master-0 kubenswrapper[7457]: I0319 09:20:04.441599 7457 state_mem.go:35] "Initializing new in-memory state store" Mar 19 09:20:04.441868 master-0 kubenswrapper[7457]: I0319 09:20:04.441845 7457 state_mem.go:75] "Updated machine memory state" Mar 19 09:20:04.441868 master-0 kubenswrapper[7457]: I0319 09:20:04.441861 7457 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 19 09:20:04.451026 master-0 kubenswrapper[7457]: I0319 09:20:04.450987 7457 manager.go:334] "Starting Device Plugin manager" Mar 19 09:20:04.451026 master-0 kubenswrapper[7457]: I0319 09:20:04.451027 7457 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 09:20:04.451218 master-0 kubenswrapper[7457]: I0319 09:20:04.451041 7457 server.go:79] "Starting device plugin registration server" Mar 19 09:20:04.451552 master-0 kubenswrapper[7457]: I0319 09:20:04.451503 7457 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 09:20:04.451618 master-0 kubenswrapper[7457]: I0319 09:20:04.451521 7457 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 09:20:04.452352 master-0 kubenswrapper[7457]: I0319 
09:20:04.452324 7457 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 09:20:04.452419 master-0 kubenswrapper[7457]: I0319 09:20:04.452402 7457 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 09:20:04.452419 master-0 kubenswrapper[7457]: I0319 09:20:04.452412 7457 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 09:20:04.551876 master-0 kubenswrapper[7457]: I0319 09:20:04.551790 7457 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:20:04.554316 master-0 kubenswrapper[7457]: I0319 09:20:04.554268 7457 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:20:04.554316 master-0 kubenswrapper[7457]: I0319 09:20:04.554318 7457 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:20:04.554470 master-0 kubenswrapper[7457]: I0319 09:20:04.554328 7457 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:20:04.554470 master-0 kubenswrapper[7457]: I0319 09:20:04.554350 7457 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:20:04.635104 master-0 kubenswrapper[7457]: I0319 09:20:04.634877 7457 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 09:20:04.636081 master-0 kubenswrapper[7457]: I0319 09:20:04.635939 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"a74f0437d5a92c82edd9e58f193503c363594aaca67bff5a5ae6fcd1a5a28477"} Mar 19 09:20:04.636081 master-0 kubenswrapper[7457]: I0319 09:20:04.636064 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5"} Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636089 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"382712d4a8a720b54161d083c15e892932ef38c413a22bb647480e2f84ff33a9"} Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636117 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4"} Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636148 7457 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba" Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636203 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0"} Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636221 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" 
event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34"} Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636240 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8"} Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636259 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43"} Mar 19 09:20:04.636281 master-0 kubenswrapper[7457]: I0319 09:20:04.636278 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"9a59b0cbe8ea8fa4b17a290e74267cd3c1f43f118142de7e624d510bbb389da7"} Mar 19 09:20:04.636862 master-0 kubenswrapper[7457]: I0319 09:20:04.636299 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979"} Mar 19 09:20:04.636862 master-0 kubenswrapper[7457]: I0319 09:20:04.636329 7457 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea06326f75dbe8dd7c60652c7838fe0eb8d997984652bd4f5b739f7370b57187" Mar 19 09:20:04.636862 master-0 kubenswrapper[7457]: I0319 09:20:04.636357 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" 
event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538"} Mar 19 09:20:04.636862 master-0 kubenswrapper[7457]: I0319 09:20:04.636380 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98"} Mar 19 09:20:04.636862 master-0 kubenswrapper[7457]: I0319 09:20:04.636402 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"ff9134bcfbd7c54799a9cf15d6a97a57adcccc8ff7840ea6e6628d638489256c"} Mar 19 09:20:04.636862 master-0 kubenswrapper[7457]: I0319 09:20:04.636426 7457 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2724f078765cc41b21ea464b50fe169d860dc07093801eacc92a75b30e3593f5" Mar 19 09:20:04.639927 master-0 kubenswrapper[7457]: I0319 09:20:04.637351 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"792d2b5907d7be3b52add934725c063cf367a575639846dbd622e4989463bf6d"} Mar 19 09:20:04.640125 master-0 kubenswrapper[7457]: I0319 09:20:04.639938 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b"} Mar 19 09:20:04.640125 master-0 kubenswrapper[7457]: I0319 09:20:04.639970 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"e5e1897ddbf62a1e1975ee8d4b56ad3a8cd0b0cf3d4e0758eac825b5a75e9b66"} Mar 19 09:20:04.640125 master-0 kubenswrapper[7457]: I0319 09:20:04.639992 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc"} Mar 19 09:20:05.018812 master-0 kubenswrapper[7457]: I0319 09:20:05.018756 7457 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 09:20:05.119374 master-0 kubenswrapper[7457]: I0319 09:20:05.119264 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:20:05.119374 master-0 kubenswrapper[7457]: I0319 09:20:05.119351 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:20:05.119644 master-0 kubenswrapper[7457]: I0319 09:20:05.119475 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:05.119644 master-0 kubenswrapper[7457]: I0319 09:20:05.119559 7457 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.119644 master-0 kubenswrapper[7457]: I0319 09:20:05.119588 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.119644 master-0 kubenswrapper[7457]: I0319 09:20:05.119610 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:20:05.119644 master-0 kubenswrapper[7457]: I0319 09:20:05.119629 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.119784 master-0 kubenswrapper[7457]: I0319 09:20:05.119650 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.119784 master-0 kubenswrapper[7457]: I0319 09:20:05.119688 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:20:05.119784 master-0 kubenswrapper[7457]: I0319 09:20:05.119715 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:20:05.119784 master-0 kubenswrapper[7457]: I0319 09:20:05.119737 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:20:05.119784 master-0 kubenswrapper[7457]: I0319 09:20:05.119775 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.119909 master-0 kubenswrapper[7457]: I0319 09:20:05.119797 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.119909 master-0 kubenswrapper[7457]: I0319 09:20:05.119817 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.120063 master-0 kubenswrapper[7457]: I0319 09:20:05.119968 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.120102 master-0 kubenswrapper[7457]: I0319 09:20:05.120081 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.120205 master-0 kubenswrapper[7457]: I0319 09:20:05.120105 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:20:05.220834 master-0 kubenswrapper[7457]: I0319 09:20:05.220763 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.220834 master-0 kubenswrapper[7457]: I0319 09:20:05.220822 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.220834 master-0 kubenswrapper[7457]: I0319 09:20:05.220839 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.220853 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.220951 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221049 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221072 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221088 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221106 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221123 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221138 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221154 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221168 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221186 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221201 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221215 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221231 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221246 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221274 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221298 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221319 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221394 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221439 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221466 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221489 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221513 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221555 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221578 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221599 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221619 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221641 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221657 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221674 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.221897 master-0 kubenswrapper[7457]: I0319 09:20:05.221692 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.266251 master-0 kubenswrapper[7457]: I0319 09:20:05.266206 7457 apiserver.go:52] "Watching apiserver"
Mar 19 09:20:05.280167 master-0 kubenswrapper[7457]: I0319 09:20:05.280062 7457 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 09:20:05.280648 master-0 kubenswrapper[7457]: I0319 09:20:05.280610 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-kwrpk","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6","openshift-network-node-identity/network-node-identity-slmgx","kube-system/bootstrap-kube-controller-manager-master-0","openshift-network-operator/network-operator-7bd846bfc4-b4d28","openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7","openshift-dns-operator/dns-operator-9c5679d8f-cbw4r","openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh","openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7","openshift-ovn-kubernetes/ovnkube-node-vcxjs","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898","openshift-etcd/etcd-master-0-master-0","openshift-multus/multus-bzdzd","openshift-multus/network-metrics-daemon-nq9vs","openshift-network-diagnostics/network-check-target-4s5vc","openshift-cluster-version/cluster-version-operator-56d8475767-prd2q","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d","openshift-network-operator/iptables-alerter-qfc76","openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd","kube-system/bootstrap-kube-scheduler-master-0","openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9","openshift-marketplace/marketplace-operator-89ccd998f-gxznr","openshift-multus/multus-additional-cni-plugins-8kv6s","openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"]
Mar 19 09:20:05.283399 master-0 kubenswrapper[7457]: I0319 09:20:05.281591 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-kwrpk"
Mar 19 09:20:05.283399 master-0 kubenswrapper[7457]: I0319 09:20:05.281750 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:05.283399 master-0 kubenswrapper[7457]: I0319 09:20:05.281757 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:05.283399 master-0 kubenswrapper[7457]: I0319 09:20:05.281829 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:20:05.283399 master-0 kubenswrapper[7457]: I0319 09:20:05.281882 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.283399 master-0 kubenswrapper[7457]: I0319 09:20:05.282314 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:05.283399 master-0 kubenswrapper[7457]: I0319 09:20:05.282680 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:05.283909 master-0 kubenswrapper[7457]: I0319 09:20:05.283884 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:05.284023 master-0 kubenswrapper[7457]: I0319 09:20:05.283958 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:05.284720 master-0 kubenswrapper[7457]: I0319 09:20:05.284685 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:05.285029 master-0 kubenswrapper[7457]: I0319 09:20:05.284963 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:20:05.285029 master-0 kubenswrapper[7457]: I0319 09:20:05.285021 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:20:05.285855 master-0 kubenswrapper[7457]: I0319 09:20:05.285824 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:20:05.286500 master-0 kubenswrapper[7457]: I0319 09:20:05.286472 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:20:05.290203 master-0 kubenswrapper[7457]: I0319 09:20:05.290156 7457 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 19 09:20:05.291901 master-0 kubenswrapper[7457]: I0319 09:20:05.291865 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 09:20:05.292100 master-0 kubenswrapper[7457]: I0319 09:20:05.292070 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.292100 master-0 kubenswrapper[7457]: I0319 09:20:05.292080 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 09:20:05.292315 master-0 kubenswrapper[7457]: I0319 09:20:05.292285 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 09:20:05.292669 master-0 kubenswrapper[7457]: I0319 09:20:05.292322 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 09:20:05.292669 master-0 kubenswrapper[7457]: I0319 09:20:05.292440 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 09:20:05.292669 master-0 kubenswrapper[7457]: I0319 09:20:05.292651 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 09:20:05.293023 master-0 kubenswrapper[7457]: I0319 09:20:05.292990 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 09:20:05.293140 master-0 kubenswrapper[7457]: I0319 09:20:05.293109 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 09:20:05.293252 master-0 kubenswrapper[7457]: I0319 09:20:05.293111 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 09:20:05.293325 master-0 kubenswrapper[7457]: I0319 09:20:05.293164 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 09:20:05.293429 master-0 kubenswrapper[7457]: I0319 09:20:05.293397 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.293645 master-0 kubenswrapper[7457]: I0319 09:20:05.293402 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 09:20:05.293857 master-0 kubenswrapper[7457]: I0319 09:20:05.293823 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 09:20:05.293940 master-0 kubenswrapper[7457]: I0319 09:20:05.293882 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.293940 master-0 kubenswrapper[7457]: I0319 09:20:05.293909 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 09:20:05.293940 master-0 kubenswrapper[7457]: I0319 09:20:05.293924 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.293940 master-0 kubenswrapper[7457]: I0319 09:20:05.293939 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 09:20:05.294145 master-0 kubenswrapper[7457]: I0319 09:20:05.294048 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.294145 master-0 kubenswrapper[7457]: I0319 09:20:05.294080 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 09:20:05.294145 master-0 kubenswrapper[7457]: I0319 09:20:05.294100 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 09:20:05.294303 master-0 kubenswrapper[7457]: I0319 09:20:05.294174 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 09:20:05.294303 master-0 kubenswrapper[7457]: I0319 09:20:05.294271 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.294414 master-0 kubenswrapper[7457]: I0319 09:20:05.294332 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:20:05.294414 master-0 kubenswrapper[7457]: I0319 09:20:05.294341 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.294517 master-0 kubenswrapper[7457]: I0319 09:20:05.294423 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 09:20:05.294517 master-0 kubenswrapper[7457]: I0319 09:20:05.294442 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 09:20:05.294654 master-0 kubenswrapper[7457]: I0319 09:20:05.294575 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:20:05.294741 master-0 kubenswrapper[7457]: I0319 09:20:05.294701 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 09:20:05.294802 master-0 kubenswrapper[7457]: I0319 09:20:05.294739 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:20:05.294857 master-0 kubenswrapper[7457]: I0319 09:20:05.294851 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 09:20:05.294936 master-0 kubenswrapper[7457]: I0319 09:20:05.294908 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 09:20:05.294993 master-0 kubenswrapper[7457]: I0319 09:20:05.294957 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 09:20:05.295078 master-0 kubenswrapper[7457]: I0319 09:20:05.295048 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 09:20:05.295125 master-0 kubenswrapper[7457]: I0319 09:20:05.295094 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.295267 master-0 kubenswrapper[7457]: I0319 09:20:05.295248 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 09:20:05.295368 master-0 kubenswrapper[7457]: I0319 09:20:05.295338 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.295421 master-0 kubenswrapper[7457]: I0319 09:20:05.295268 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 09:20:05.295506 master-0 kubenswrapper[7457]: I0319 09:20:05.295344 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 09:20:05.299301 master-0 kubenswrapper[7457]: I0319 09:20:05.299264 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:20:05.299617 master-0 kubenswrapper[7457]: I0319 09:20:05.299595 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.299813 master-0 kubenswrapper[7457]: I0319 09:20:05.299791 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.300164 master-0 kubenswrapper[7457]: I0319 09:20:05.300144 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.300263 master-0 kubenswrapper[7457]: I0319 09:20:05.300244 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 09:20:05.304814 master-0 kubenswrapper[7457]: I0319 09:20:05.304761 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.305037 master-0 kubenswrapper[7457]: I0319 09:20:05.305004 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.305320 master-0 kubenswrapper[7457]: I0319 09:20:05.305297 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 09:20:05.306911 master-0 kubenswrapper[7457]: I0319 09:20:05.306391 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 09:20:05.306911 master-0 kubenswrapper[7457]: I0319 09:20:05.306879 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 09:20:05.307818 master-0 kubenswrapper[7457]: I0319 09:20:05.307789 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 09:20:05.309003 master-0 kubenswrapper[7457]: I0319 09:20:05.307910 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 09:20:05.309003 master-0 kubenswrapper[7457]: I0319 09:20:05.308447 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:20:05.309003 master-0 kubenswrapper[7457]: I0319 09:20:05.308505 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 09:20:05.309393 master-0 kubenswrapper[7457]: I0319 09:20:05.309370 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.312445 master-0 kubenswrapper[7457]: I0319 09:20:05.312413 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 09:20:05.312659 master-0 kubenswrapper[7457]: I0319 09:20:05.312636 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 09:20:05.313365 master-0 kubenswrapper[7457]: I0319 09:20:05.313341 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:20:05.313499 master-0 kubenswrapper[7457]: I0319 09:20:05.313429 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 09:20:05.313499 master-0 kubenswrapper[7457]: I0319 09:20:05.313459 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.313675 master-0 kubenswrapper[7457]: I0319 09:20:05.313544 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.313675 master-0 kubenswrapper[7457]: I0319 09:20:05.313592 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 09:20:05.313675 master-0 kubenswrapper[7457]: I0319 09:20:05.313620 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.313899 master-0 kubenswrapper[7457]: I0319 09:20:05.313744 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 09:20:05.313899 master-0 kubenswrapper[7457]: I0319 09:20:05.313842 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 09:20:05.313899 master-0 kubenswrapper[7457]: I0319 09:20:05.313876 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 09:20:05.313899 master-0 kubenswrapper[7457]: I0319 09:20:05.313881 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 09:20:05.314200 master-0 kubenswrapper[7457]: I0319 09:20:05.313929 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 09:20:05.314200 master-0 kubenswrapper[7457]: I0319 09:20:05.313953 7457 reflector.go:368] Caches
populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:20:05.314200 master-0 kubenswrapper[7457]: I0319 09:20:05.313969 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:20:05.314200 master-0 kubenswrapper[7457]: I0319 09:20:05.314039 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:20:05.314200 master-0 kubenswrapper[7457]: I0319 09:20:05.314139 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:20:05.314200 master-0 kubenswrapper[7457]: I0319 09:20:05.314192 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314212 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314254 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314317 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314303 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314257 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314373 7457 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314505 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314539 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314581 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314654 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314673 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:20:05.314718 master-0 kubenswrapper[7457]: I0319 09:20:05.314707 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.314757 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.314822 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.314851 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.314921 7457 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.315004 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.315019 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.316094 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.316257 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.317128 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 09:20:05.318147 master-0 kubenswrapper[7457]: I0319 09:20:05.317497 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 09:20:05.319513 master-0 kubenswrapper[7457]: I0319 09:20:05.319470 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:20:05.319731 master-0 kubenswrapper[7457]: I0319 09:20:05.319690 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:20:05.320566 master-0 kubenswrapper[7457]: I0319 09:20:05.320508 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 09:20:05.321891 master-0 kubenswrapper[7457]: I0319 09:20:05.321808 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:20:05.321990 master-0 kubenswrapper[7457]: I0319 09:20:05.321899 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:20:05.321990 master-0 kubenswrapper[7457]: I0319 09:20:05.321930 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:20:05.321990 master-0 kubenswrapper[7457]: I0319 09:20:05.321957 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.322213 master-0 kubenswrapper[7457]: I0319 09:20:05.322149 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:20:05.322213 master-0 kubenswrapper[7457]: I0319 09:20:05.322185 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.322514 master-0 kubenswrapper[7457]: I0319 09:20:05.322440 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.322514 master-0 kubenswrapper[7457]: I0319 09:20:05.322498 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.322699 master-0 kubenswrapper[7457]: I0319 09:20:05.322537 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvnb9\" (UniqueName: \"kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.322699 master-0 kubenswrapper[7457]: I0319 09:20:05.322567 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script\") pod \"iptables-alerter-qfc76\" 
(UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:20:05.322699 master-0 kubenswrapper[7457]: I0319 09:20:05.322664 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.322699 master-0 kubenswrapper[7457]: I0319 09:20:05.322691 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:20:05.322951 master-0 kubenswrapper[7457]: I0319 09:20:05.322736 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.322951 master-0 kubenswrapper[7457]: I0319 09:20:05.322885 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.322951 master-0 kubenswrapper[7457]: I0319 09:20:05.322926 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca\") pod 
\"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:20:05.322951 master-0 kubenswrapper[7457]: I0319 09:20:05.322872 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:20:05.323183 master-0 kubenswrapper[7457]: I0319 09:20:05.322977 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:20:05.323183 master-0 kubenswrapper[7457]: I0319 09:20:05.322998 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:20:05.323183 master-0 kubenswrapper[7457]: I0319 09:20:05.323015 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:20:05.323183 master-0 
kubenswrapper[7457]: I0319 09:20:05.323058 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bdnt\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:20:05.323379 master-0 kubenswrapper[7457]: I0319 09:20:05.323066 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:20:05.323379 master-0 kubenswrapper[7457]: I0319 09:20:05.323251 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.323379 master-0 kubenswrapper[7457]: I0319 09:20:05.323215 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:20:05.323499 master-0 kubenswrapper[7457]: I0319 09:20:05.323389 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " 
pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:20:05.323499 master-0 kubenswrapper[7457]: I0319 09:20:05.323388 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:20:05.323665 master-0 kubenswrapper[7457]: I0319 09:20:05.323582 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:20:05.323665 master-0 kubenswrapper[7457]: I0319 09:20:05.323649 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j65pb\" (UniqueName: \"kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:20:05.323765 master-0 kubenswrapper[7457]: I0319 09:20:05.323713 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:20:05.323765 master-0 kubenswrapper[7457]: I0319 09:20:05.323748 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:20:05.323856 master-0 kubenswrapper[7457]: I0319 09:20:05.323765 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.323958 master-0 kubenswrapper[7457]: I0319 09:20:05.323814 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:20:05.324013 master-0 kubenswrapper[7457]: I0319 09:20:05.323985 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:20:05.324013 master-0 kubenswrapper[7457]: I0319 09:20:05.323933 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:20:05.324098 master-0 kubenswrapper[7457]: I0319 09:20:05.324031 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:05.324098 master-0 kubenswrapper[7457]: I0319 09:20:05.324064 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:20:05.324098 master-0 kubenswrapper[7457]: I0319 09:20:05.324085 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:05.324213 master-0 kubenswrapper[7457]: I0319 09:20:05.324135 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.324213 master-0 kubenswrapper[7457]: I0319 
09:20:05.324184 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:20:05.324492 master-0 kubenswrapper[7457]: I0319 09:20:05.324456 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hnvh\" (UniqueName: \"kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:20:05.326226 master-0 kubenswrapper[7457]: I0319 09:20:05.326196 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:20:05.326791 master-0 kubenswrapper[7457]: I0319 09:20:05.326500 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 09:20:05.326791 master-0 kubenswrapper[7457]: I0319 09:20:05.326642 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:20:05.327233 master-0 kubenswrapper[7457]: I0319 09:20:05.327192 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxjqg\" (UniqueName: \"kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:20:05.327407 master-0 kubenswrapper[7457]: I0319 09:20:05.327290 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.327468 master-0 kubenswrapper[7457]: I0319 09:20:05.327449 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:20:05.327516 master-0 kubenswrapper[7457]: I0319 09:20:05.327498 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:20:05.327583 master-0 kubenswrapper[7457]: I0319 09:20:05.327552 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.327617 master-0 kubenswrapper[7457]: I0319 09:20:05.327586 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:20:05.327654 master-0 kubenswrapper[7457]: I0319 09:20:05.327621 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.327696 master-0 kubenswrapper[7457]: I0319 09:20:05.327664 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.327767 master-0 kubenswrapper[7457]: I0319 09:20:05.327746 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6tp5\" (UniqueName: \"kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:20:05.327918 master-0 kubenswrapper[7457]: I0319 09:20:05.327828 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxfs\" (UniqueName: \"kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:20:05.327918 master-0 kubenswrapper[7457]: I0319 09:20:05.327892 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:20:05.327918 master-0 kubenswrapper[7457]: I0319 09:20:05.327897 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:20:05.328035 master-0 kubenswrapper[7457]: I0319 09:20:05.327950 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.328071 master-0 kubenswrapper[7457]: I0319 09:20:05.328027 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwbw\" (UniqueName: \"kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:20:05.328104 master-0 kubenswrapper[7457]: I0319 09:20:05.328069 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.328192 master-0 kubenswrapper[7457]: I0319 09:20:05.328118 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f5s\" (UniqueName: \"kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:05.328221 master-0 kubenswrapper[7457]: I0319 09:20:05.328200 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:20:05.328344 master-0 kubenswrapper[7457]: I0319 09:20:05.328324 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.328419 master-0 kubenswrapper[7457]: I0319 09:20:05.328387 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.328450 master-0 kubenswrapper[7457]: I0319 09:20:05.328433 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898"
Mar 19 09:20:05.328589 master-0 kubenswrapper[7457]: I0319 09:20:05.328564 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.328640 master-0 kubenswrapper[7457]: I0319 09:20:05.328599 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.328761 master-0 kubenswrapper[7457]: I0319 09:20:05.328741 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:20:05.328820 master-0 kubenswrapper[7457]: I0319 09:20:05.328774 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76"
Mar 19 09:20:05.328860 master-0 kubenswrapper[7457]: I0319 09:20:05.328842 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:20:05.328897 master-0 kubenswrapper[7457]: I0319 09:20:05.328884 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:20:05.328930 master-0 kubenswrapper[7457]: I0319 09:20:05.328911 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:20:05.329031 master-0 kubenswrapper[7457]: I0319 09:20:05.329007 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.329088 master-0 kubenswrapper[7457]: I0319 09:20:05.329069 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blbc\" (UniqueName: \"kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:20:05.329157 master-0 kubenswrapper[7457]: I0319 09:20:05.329128 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:05.329192 master-0 kubenswrapper[7457]: I0319 09:20:05.329177 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.329248 master-0 kubenswrapper[7457]: I0319 09:20:05.329209 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:20:05.329248 master-0 kubenswrapper[7457]: I0319 09:20:05.329225 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.329248 master-0 kubenswrapper[7457]: I0319 09:20:05.329194 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:20:05.329355 master-0 kubenswrapper[7457]: I0319 09:20:05.329336 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx"
Mar 19 09:20:05.329385 master-0 kubenswrapper[7457]: I0319 09:20:05.329365 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:20:05.329413 master-0 kubenswrapper[7457]: I0319 09:20:05.329388 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:05.329413 master-0 kubenswrapper[7457]: I0319 09:20:05.329407 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.329469 master-0 kubenswrapper[7457]: I0319 09:20:05.329425 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:20:05.329469 master-0 kubenswrapper[7457]: I0319 09:20:05.329445 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:20:05.329469 master-0 kubenswrapper[7457]: I0319 09:20:05.329466 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hw6b\" (UniqueName: \"kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:05.329584 master-0 kubenswrapper[7457]: I0319 09:20:05.329489 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtw68\" (UniqueName: \"kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:20:05.329584 master-0 kubenswrapper[7457]: I0319 09:20:05.329512 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.329584 master-0 kubenswrapper[7457]: I0319 09:20:05.329552 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:20:05.329584 master-0 kubenswrapper[7457]: I0319 09:20:05.329569 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.329584 master-0 kubenswrapper[7457]: I0319 09:20:05.329584 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329599 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329617 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329646 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hqj\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329663 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329682 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329730 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329746 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329765 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329771 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx"
Mar 19 09:20:05.329790 master-0 kubenswrapper[7457]: I0319 09:20:05.329781 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v88k\" (UniqueName: \"kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:05.330061 master-0 kubenswrapper[7457]: I0319 09:20:05.329830 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:05.330061 master-0 kubenswrapper[7457]: I0319 09:20:05.329888 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:05.330121 master-0 kubenswrapper[7457]: I0319 09:20:05.330072 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:20:05.330179 master-0 kubenswrapper[7457]: I0319 09:20:05.330143 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:20:05.330420 master-0 kubenswrapper[7457]: I0319 09:20:05.330393 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:20:05.330452 master-0 kubenswrapper[7457]: I0319 09:20:05.330419 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:20:05.330452 master-0 kubenswrapper[7457]: I0319 09:20:05.330436 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.330512 master-0 kubenswrapper[7457]: I0319 09:20:05.330467 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:20:05.330568 master-0 kubenswrapper[7457]: I0319 09:20:05.330554 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:20:05.330685 master-0 kubenswrapper[7457]: I0319 09:20:05.330657 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7t5\" (UniqueName: \"kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:20:05.330717 master-0 kubenswrapper[7457]: I0319 09:20:05.330686 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:05.330717 master-0 kubenswrapper[7457]: I0319 09:20:05.330690 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:20:05.330773 master-0 kubenswrapper[7457]: I0319 09:20:05.330721 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.330773 master-0 kubenswrapper[7457]: I0319 09:20:05.330742 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"
Mar 19 09:20:05.330773 master-0 kubenswrapper[7457]: I0319 09:20:05.330750 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:20:05.330853 master-0 kubenswrapper[7457]: I0319 09:20:05.330759 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvd6f\" (UniqueName: \"kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"
Mar 19 09:20:05.330853 master-0 kubenswrapper[7457]: I0319 09:20:05.330805 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clpb5\" (UniqueName: \"kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:20:05.330853 master-0 kubenswrapper[7457]: I0319 09:20:05.330845 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.330853 master-0 kubenswrapper[7457]: I0319 09:20:05.330848 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:20:05.330958 master-0 kubenswrapper[7457]: I0319 09:20:05.330872 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:20:05.330958 master-0 kubenswrapper[7457]: I0319 09:20:05.330900 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:05.330958 master-0 kubenswrapper[7457]: I0319 09:20:05.330917 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:05.331239 master-0 kubenswrapper[7457]: I0319 09:20:05.330962 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:20:05.331239 master-0 kubenswrapper[7457]: I0319 09:20:05.331026 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.331239 master-0 kubenswrapper[7457]: I0319 09:20:05.331045 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"
Mar 19 09:20:05.331319 master-0 kubenswrapper[7457]: I0319 09:20:05.331272 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:20:05.331347 master-0 kubenswrapper[7457]: I0319 09:20:05.331328 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:20:05.331394 master-0 kubenswrapper[7457]: I0319 09:20:05.331355 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcf6\" (UniqueName: \"kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:05.331434 master-0 kubenswrapper[7457]: I0319 09:20:05.331406 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:20:05.331507 master-0 kubenswrapper[7457]: I0319 09:20:05.331477 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6dq\" (UniqueName: \"kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:20:05.331558 master-0 kubenswrapper[7457]: I0319 09:20:05.331508 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.331558 master-0 kubenswrapper[7457]: I0319 09:20:05.331545 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.331619 master-0 kubenswrapper[7457]: I0319 09:20:05.331564 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:05.331619 master-0 kubenswrapper[7457]: I0319 09:20:05.331611 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9plst\" (UniqueName: \"kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst\") pod \"csi-snapshot-controller-operator-5f5d689c6b-jv8lm\" (UID: \"8e073eb4-67f2-4de7-8848-50da73079dbc\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"
Mar 19 09:20:05.331706 master-0 kubenswrapper[7457]: I0319 09:20:05.331686 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.331743 master-0 kubenswrapper[7457]: I0319 09:20:05.331735 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.331777 master-0 kubenswrapper[7457]: I0319 09:20:05.331756 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.331828 master-0 kubenswrapper[7457]: I0319 09:20:05.331805 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.331865 master-0 kubenswrapper[7457]: I0319 09:20:05.331844 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd"
Mar 19 09:20:05.331909 master-0 kubenswrapper[7457]: I0319 09:20:05.331875 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.331942 master-0 kubenswrapper[7457]: I0319 09:20:05.331908 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rqsq\" (UniqueName: \"kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:20:05.331942 master-0 kubenswrapper[7457]: I0319 09:20:05.331933 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:05.331995 master-0 kubenswrapper[7457]: I0319 09:20:05.331961 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:20:05.331995 master-0 kubenswrapper[7457]: I0319 09:20:05.331991 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.332049 master-0 kubenswrapper[7457]: I0319 09:20:05.332020 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.332304
master-0 kubenswrapper[7457]: I0319 09:20:05.332273 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:20:05.332337 master-0 kubenswrapper[7457]: I0319 09:20:05.332305 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:20:05.332376 master-0 kubenswrapper[7457]: I0319 09:20:05.332327 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfq74\" (UniqueName: \"kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:20:05.332410 master-0 kubenswrapper[7457]: I0319 09:20:05.332387 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:20:05.332494 master-0 kubenswrapper[7457]: I0319 09:20:05.332470 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod 
\"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:20:05.332547 master-0 kubenswrapper[7457]: I0319 09:20:05.332500 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.332547 master-0 kubenswrapper[7457]: I0319 09:20:05.332534 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:20:05.332609 master-0 kubenswrapper[7457]: I0319 09:20:05.332575 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:20:05.332642 master-0 kubenswrapper[7457]: I0319 09:20:05.332614 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:20:05.332674 master-0 kubenswrapper[7457]: I0319 09:20:05.332633 7457 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.332719 master-0 kubenswrapper[7457]: I0319 09:20:05.332681 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.332926 master-0 kubenswrapper[7457]: I0319 09:20:05.332890 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:20:05.333022 master-0 kubenswrapper[7457]: I0319 09:20:05.332963 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:20:05.333022 master-0 kubenswrapper[7457]: I0319 09:20:05.332963 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 
19 09:20:05.333186 master-0 kubenswrapper[7457]: I0319 09:20:05.333148 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:20:05.333236 master-0 kubenswrapper[7457]: I0319 09:20:05.333210 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.333514 master-0 kubenswrapper[7457]: I0319 09:20:05.333479 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:20:05.333514 master-0 kubenswrapper[7457]: I0319 09:20:05.333274 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:20:05.333604 master-0 kubenswrapper[7457]: I0319 09:20:05.333536 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod 
\"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.333631 master-0 kubenswrapper[7457]: I0319 09:20:05.333601 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhzsr\" (UniqueName: \"kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.333719 master-0 kubenswrapper[7457]: I0319 09:20:05.333649 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:20:05.333951 master-0 kubenswrapper[7457]: I0319 09:20:05.333929 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:05.333999 master-0 kubenswrapper[7457]: I0319 09:20:05.333984 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9jf\" (UniqueName: \"kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:20:05.334030 master-0 kubenswrapper[7457]: I0319 09:20:05.334019 7457 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:20:05.334236 master-0 kubenswrapper[7457]: I0319 09:20:05.334216 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.335398 master-0 kubenswrapper[7457]: I0319 09:20:05.334820 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:20:05.335398 master-0 kubenswrapper[7457]: I0319 09:20:05.334831 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:20:05.335398 master-0 kubenswrapper[7457]: I0319 09:20:05.335002 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.335398 master-0 kubenswrapper[7457]: I0319 09:20:05.335052 
7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:20:05.335398 master-0 kubenswrapper[7457]: I0319 09:20:05.335184 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:20:05.335398 master-0 kubenswrapper[7457]: I0319 09:20:05.335327 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:20:05.335398 master-0 kubenswrapper[7457]: I0319 09:20:05.335338 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:20:05.335835 master-0 kubenswrapper[7457]: I0319 09:20:05.335810 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" 
(UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:05.336023 master-0 kubenswrapper[7457]: I0319 09:20:05.336006 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:20:05.336944 master-0 kubenswrapper[7457]: I0319 09:20:05.336764 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:20:05.336944 master-0 kubenswrapper[7457]: I0319 09:20:05.336797 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:20:05.336944 master-0 kubenswrapper[7457]: I0319 09:20:05.336807 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.336944 master-0 kubenswrapper[7457]: I0319 09:20:05.336935 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:20:05.337104 master-0 kubenswrapper[7457]: I0319 09:20:05.336974 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:20:05.337779 master-0 kubenswrapper[7457]: I0319 09:20:05.337745 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:20:05.435194 master-0 kubenswrapper[7457]: I0319 09:20:05.435140 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435194 master-0 kubenswrapper[7457]: I0319 09:20:05.435192 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435482 master-0 
kubenswrapper[7457]: I0319 09:20:05.435227 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435266 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435289 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435319 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435355 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435379 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435401 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435420 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435447 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435482 master-0 kubenswrapper[7457]: I0319 09:20:05.435469 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " 
pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435490 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435512 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435559 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435589 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435610 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435631 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435652 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435699 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435726 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435772 7457 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435813 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435833 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435860 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435883 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435905 7457 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:05.435940 master-0 kubenswrapper[7457]: I0319 09:20:05.435938 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.435960 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.435983 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436006 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436029 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436097 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436126 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436147 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436170 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436191 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436221 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436241 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436270 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436291 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436312 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436340 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436360 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436396 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436447 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436476 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436497 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.436511 master-0 kubenswrapper[7457]: I0319 09:20:05.436535 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.436557 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.436668 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.436718 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.436748 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.436779 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.436811 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.436840 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: E0319 09:20:05.436922 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: E0319 09:20:05.436992 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.936970311 +0000 UTC m=+1.792309681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: E0319 09:20:05.437046 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: E0319 09:20:05.437070 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.937062653 +0000 UTC m=+1.792402023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.437107 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.437140 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.437174 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.437207 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437248 master-0 kubenswrapper[7457]: I0319 09:20:05.437238 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: E0319 09:20:05.437284 7457 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: E0319 09:20:05.437309 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.937300709 +0000 UTC m=+1.792640069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: E0319 09:20:05.437348 7457 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: E0319 09:20:05.437371 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.937363841 +0000 UTC m=+1.792703211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437396 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437478 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437511 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437561 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: E0319 09:20:05.437609 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: E0319 09:20:05.437661 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.937652479 +0000 UTC m=+1.792991849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437692 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437724 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437755 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.437833 master-0 kubenswrapper[7457]: I0319 09:20:05.437799 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.437846 7457 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.437868 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.937860434 +0000 UTC m=+1.793199804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: I0319 09:20:05.437894 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76"
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.437942 7457 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.437968 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.937959927 +0000 UTC m=+1.793299297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.438011 7457 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.438034 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.938026518 +0000 UTC m=+1.793365888 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: I0319 09:20:05.438072 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: I0319 09:20:05.438115 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.438163 7457 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.438185 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.938177802 +0000 UTC m=+1.793517172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : secret "metrics-daemon-secret" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.438225 7457 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: E0319 09:20:05.438249 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.938241284 +0000 UTC m=+1.793580654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:05.438297 master-0 kubenswrapper[7457]: I0319 09:20:05.438278 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438309 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438346 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438373 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: E0319 09:20:05.438417 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: E0319 09:20:05.438441 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.938433828 +0000 UTC m=+1.793773198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438478 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438510 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438555 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: E0319 09:20:05.438600 7457 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: E0319 09:20:05.438622 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.938614363 +0000 UTC m=+1.793953733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: E0319 09:20:05.438662 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: E0319 09:20:05.438686 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:05.938676974 +0000 UTC m=+1.794016344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438752 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438761 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.438804 master-0 kubenswrapper[7457]: I0319 09:20:05.438809 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.439384 master-0 kubenswrapper[7457]: I0319 09:20:05.438784 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.439384 master-0 kubenswrapper[7457]: I0319 09:20:05.438842 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:20:05.439384 master-0 kubenswrapper[7457]: I0319 09:20:05.438876 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:20:05.439384 master-0 kubenswrapper[7457]: I0319 09:20:05.438898 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:20:05.439384 master-0 kubenswrapper[7457]: I0319 09:20:05.438919 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.776367 master-0 kubenswrapper[7457]: E0319 09:20:05.776310 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:20:05.777017 master-0 kubenswrapper[7457]: E0319 09:20:05.776984 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:05.942486 master-0 kubenswrapper[7457]: I0319 09:20:05.942404 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:05.942486 master-0 kubenswrapper[7457]: I0319 09:20:05.942462 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942505 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942627 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942651 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942674 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942705 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942730 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942793 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:05.942850 master-0 kubenswrapper[7457]: I0319 09:20:05.942845 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:05.943054 master-0 kubenswrapper[7457]: I0319 09:20:05.942868 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:05.943054 master-0 kubenswrapper[7457]: I0319 09:20:05.942986 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName:
\"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:05.943054 master-0 kubenswrapper[7457]: I0319 09:20:05.943037 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943171 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943235 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943218923 +0000 UTC m=+2.798558313 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943241 7457 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943281 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943270204 +0000 UTC m=+2.798609574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943288 7457 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943320 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943308615 +0000 UTC m=+2.798648005 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : secret "metrics-daemon-secret" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943332 7457 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943353 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943347126 +0000 UTC m=+2.798686496 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943380 7457 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943388 7457 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943405 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:06.943399977 +0000 UTC m=+2.798739347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943415 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943411077 +0000 UTC m=+2.798750447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943445 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943452 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943461 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:06.943456448 +0000 UTC m=+2.798795818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943481 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943469959 +0000 UTC m=+2.798809349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943491 7457 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943505 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.94350069 +0000 UTC m=+2.798840060 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943561 7457 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943568 7457 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943593 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943582942 +0000 UTC m=+2.798922332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943612 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943603592 +0000 UTC m=+2.798942982 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943618 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:20:05.943649 master-0 kubenswrapper[7457]: E0319 09:20:05.943636 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943630943 +0000 UTC m=+2.798970313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found Mar 19 09:20:05.944315 master-0 kubenswrapper[7457]: E0319 09:20:05.943665 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:20:05.944315 master-0 kubenswrapper[7457]: E0319 09:20:05.943697 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:06.943686174 +0000 UTC m=+2.799025554 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found Mar 19 09:20:06.029018 master-0 kubenswrapper[7457]: E0319 09:20:06.027284 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:20:06.029018 master-0 kubenswrapper[7457]: E0319 09:20:06.027768 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:20:06.029018 master-0 kubenswrapper[7457]: W0319 09:20:06.028652 7457 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 19 09:20:06.029018 master-0 kubenswrapper[7457]: E0319 09:20:06.028691 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:20:06.064246 master-0 kubenswrapper[7457]: 
I0319 09:20:06.064170 7457 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 19 09:20:06.064516 master-0 kubenswrapper[7457]: I0319 09:20:06.064487 7457 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 19 09:20:06.076785 master-0 kubenswrapper[7457]: I0319 09:20:06.076721 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:20:06.078553 master-0 kubenswrapper[7457]: I0319 09:20:06.078476 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j65pb\" (UniqueName: \"kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:20:06.081710 master-0 kubenswrapper[7457]: I0319 09:20:06.081665 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hnvh\" (UniqueName: \"kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:20:06.083680 master-0 kubenswrapper[7457]: I0319 09:20:06.083638 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bdnt\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:20:06.085339 master-0 
kubenswrapper[7457]: I0319 09:20:06.085297 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxjqg\" (UniqueName: \"kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:20:06.086062 master-0 kubenswrapper[7457]: I0319 09:20:06.086019 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvnb9\" (UniqueName: \"kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:06.087724 master-0 kubenswrapper[7457]: I0319 09:20:06.087655 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:20:06.090863 master-0 kubenswrapper[7457]: I0319 09:20:06.090831 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6tp5\" (UniqueName: \"kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:20:06.091052 master-0 kubenswrapper[7457]: I0319 09:20:06.091005 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6\") pod 
\"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:20:06.191514 master-0 kubenswrapper[7457]: I0319 09:20:06.191024 7457 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:20:06.266915 master-0 kubenswrapper[7457]: I0319 09:20:06.266870 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v88k\" (UniqueName: \"kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:20:06.267351 master-0 kubenswrapper[7457]: I0319 09:20:06.267143 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwbw\" (UniqueName: \"kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:20:06.267453 master-0 kubenswrapper[7457]: I0319 09:20:06.267421 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtw68\" (UniqueName: \"kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:20:06.267885 master-0 kubenswrapper[7457]: I0319 09:20:06.267863 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: 
\"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:20:06.268065 master-0 kubenswrapper[7457]: I0319 09:20:06.268028 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blbc\" (UniqueName: \"kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:20:06.269916 master-0 kubenswrapper[7457]: I0319 09:20:06.269898 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hw6b\" (UniqueName: \"kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:20:06.270845 master-0 kubenswrapper[7457]: I0319 09:20:06.270827 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f5s\" (UniqueName: \"kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:20:06.271631 master-0 kubenswrapper[7457]: I0319 09:20:06.271615 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hqj\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:20:06.274313 master-0 kubenswrapper[7457]: I0319 09:20:06.274288 7457 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-djxfs\" (UniqueName: \"kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:20:06.277125 master-0 kubenswrapper[7457]: I0319 09:20:06.277031 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvd6f\" (UniqueName: \"kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:20:06.281134 master-0 kubenswrapper[7457]: I0319 09:20:06.281047 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:20:06.283427 master-0 kubenswrapper[7457]: I0319 09:20:06.283409 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:20:06.446778 master-0 kubenswrapper[7457]: I0319 09:20:06.446656 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:20:06.656558 master-0 kubenswrapper[7457]: I0319 09:20:06.656319 7457 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:06.758752 master-0 kubenswrapper[7457]: I0319 09:20:06.758692 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp9jf\" (UniqueName: \"kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:20:06.759751 master-0 kubenswrapper[7457]: I0319 09:20:06.759709 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rqsq\" (UniqueName: \"kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:20:06.760845 master-0 kubenswrapper[7457]: I0319 09:20:06.760804 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:20:06.765078 master-0 kubenswrapper[7457]: I0319 09:20:06.765039 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcf6\" (UniqueName: \"kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:06.766333 master-0 kubenswrapper[7457]: 
I0319 09:20:06.766291 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9plst\" (UniqueName: \"kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst\") pod \"csi-snapshot-controller-operator-5f5d689c6b-jv8lm\" (UID: \"8e073eb4-67f2-4de7-8848-50da73079dbc\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm" Mar 19 09:20:06.767087 master-0 kubenswrapper[7457]: I0319 09:20:06.767044 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:20:06.767904 master-0 kubenswrapper[7457]: I0319 09:20:06.767863 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6dq\" (UniqueName: \"kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:20:06.770418 master-0 kubenswrapper[7457]: I0319 09:20:06.770377 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhzsr\" (UniqueName: \"kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:20:06.770825 master-0 kubenswrapper[7457]: I0319 09:20:06.770767 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfq74\" (UniqueName: 
\"kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:20:06.771190 master-0 kubenswrapper[7457]: I0319 09:20:06.771147 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7t5\" (UniqueName: \"kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:20:06.771190 master-0 kubenswrapper[7457]: I0319 09:20:06.771165 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:20:06.783978 master-0 kubenswrapper[7457]: I0319 09:20:06.783921 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clpb5\" (UniqueName: \"kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:20:06.791488 master-0 kubenswrapper[7457]: I0319 09:20:06.790257 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:20:06.958867 master-0 kubenswrapper[7457]: I0319 09:20:06.958596 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:20:06.958867 master-0 kubenswrapper[7457]: E0319 09:20:06.958750 7457 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:06.958867 master-0 kubenswrapper[7457]: I0319 09:20:06.958766 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:20:06.959281 master-0 kubenswrapper[7457]: E0319 09:20:06.958835 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.958814454 +0000 UTC m=+4.814153824 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found Mar 19 09:20:06.959281 master-0 kubenswrapper[7457]: E0319 09:20:06.959017 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:20:06.959388 master-0 kubenswrapper[7457]: E0319 09:20:06.959281 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.959245785 +0000 UTC m=+4.814585205 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found Mar 19 09:20:06.959388 master-0 kubenswrapper[7457]: I0319 09:20:06.959354 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:06.959470 master-0 kubenswrapper[7457]: I0319 09:20:06.959393 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: 
\"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:20:06.959470 master-0 kubenswrapper[7457]: I0319 09:20:06.959432 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:20:06.959591 master-0 kubenswrapper[7457]: E0319 09:20:06.959541 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:20:06.959591 master-0 kubenswrapper[7457]: E0319 09:20:06.959553 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:20:06.959665 master-0 kubenswrapper[7457]: E0319 09:20:06.959597 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.959581154 +0000 UTC m=+4.814920564 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found Mar 19 09:20:06.959665 master-0 kubenswrapper[7457]: E0319 09:20:06.959621 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.959610495 +0000 UTC m=+4.814949975 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found Mar 19 09:20:06.959665 master-0 kubenswrapper[7457]: E0319 09:20:06.959598 7457 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:20:06.959665 master-0 kubenswrapper[7457]: I0319 09:20:06.959650 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:20:06.959825 master-0 kubenswrapper[7457]: I0319 09:20:06.959699 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod 
\"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:06.959825 master-0 kubenswrapper[7457]: E0319 09:20:06.959717 7457 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:20:06.959825 master-0 kubenswrapper[7457]: I0319 09:20:06.959747 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:20:06.959825 master-0 kubenswrapper[7457]: E0319 09:20:06.959757 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.959743608 +0000 UTC m=+4.815082988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:20:06.959825 master-0 kubenswrapper[7457]: E0319 09:20:06.959811 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.95979756 +0000 UTC m=+4.815137020 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found Mar 19 09:20:06.960060 master-0 kubenswrapper[7457]: E0319 09:20:06.959929 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:06.960060 master-0 kubenswrapper[7457]: E0319 09:20:06.960040 7457 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:06.960060 master-0 kubenswrapper[7457]: E0319 09:20:06.960051 7457 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:20:06.960177 master-0 kubenswrapper[7457]: I0319 09:20:06.959943 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:20:06.960177 master-0 kubenswrapper[7457]: E0319 09:20:06.960080 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.960042216 +0000 UTC m=+4.815381626 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:06.960257 master-0 kubenswrapper[7457]: E0319 09:20:06.960201 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.960172789 +0000 UTC m=+4.815512239 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found Mar 19 09:20:06.960340 master-0 kubenswrapper[7457]: I0319 09:20:06.960277 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:20:06.960340 master-0 kubenswrapper[7457]: E0319 09:20:06.960320 7457 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:20:06.960423 master-0 kubenswrapper[7457]: E0319 09:20:06.960373 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:08.960360834 +0000 UTC m=+4.815700204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:20:06.960423 master-0 kubenswrapper[7457]: I0319 09:20:06.960368 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:20:06.960423 master-0 kubenswrapper[7457]: E0319 09:20:06.960396 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.960388484 +0000 UTC m=+4.815727854 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found Mar 19 09:20:06.960423 master-0 kubenswrapper[7457]: E0319 09:20:06.960405 7457 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:20:06.960590 master-0 kubenswrapper[7457]: I0319 09:20:06.960449 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:20:06.960590 master-0 kubenswrapper[7457]: E0319 09:20:06.960482 7457 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:06.960590 master-0 kubenswrapper[7457]: E0319 09:20:06.960522 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.960508697 +0000 UTC m=+4.815848147 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : secret "metrics-daemon-secret" not found Mar 19 09:20:06.960590 master-0 kubenswrapper[7457]: E0319 09:20:06.960567 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:20:06.960590 master-0 kubenswrapper[7457]: I0319 09:20:06.960520 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:20:06.960819 master-0 kubenswrapper[7457]: E0319 09:20:06.960596 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:08.960573849 +0000 UTC m=+4.815913309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:06.960819 master-0 kubenswrapper[7457]: E0319 09:20:06.960665 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:08.960649341 +0000 UTC m=+4.815988751 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found Mar 19 09:20:07.550819 master-0 kubenswrapper[7457]: I0319 09:20:07.550744 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:20:07.884404 master-0 kubenswrapper[7457]: I0319 09:20:07.884118 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:20:08.089738 master-0 kubenswrapper[7457]: I0319 09:20:08.089698 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:08.095101 master-0 kubenswrapper[7457]: I0319 09:20:08.095068 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:08.399562 master-0 kubenswrapper[7457]: I0319 09:20:08.399499 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:08.399996 master-0 kubenswrapper[7457]: I0319 09:20:08.399966 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:20:09.020776 master-0 kubenswrapper[7457]: I0319 09:20:09.020690 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: 
\"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.020801 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.020842 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.020857 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.020893 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.020875137 +0000 UTC m=+8.876214507 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.020913 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.020938 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.020955 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.020993 7457 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.021030 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.021051 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021070 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021049651 +0000 UTC m=+8.876389051 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021093 7457 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.021106 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021116 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021107043 +0000 UTC m=+8.876446403 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021152 7457 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021262 7457 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021293 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021271727 +0000 UTC m=+8.876611187 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021333 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021314478 +0000 UTC m=+8.876653908 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : secret "metrics-daemon-secret" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.021263 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.021373 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.021418 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021422 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021517 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021494932 +0000 UTC m=+8.876834312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021568 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021627 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021645 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021623056 +0000 UTC m=+8.876962456 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021677 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021662167 +0000 UTC m=+8.877001657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021682 7457 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: I0319 09:20:09.021715 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021730 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021716888 +0000 UTC m=+8.877056388 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021745 7457 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021572 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021773 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021763439 +0000 UTC m=+8.877102909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021794 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.02177956 +0000 UTC m=+8.877119050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021813 7457 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021817 7457 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021831 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021825611 +0000 UTC m=+8.877164981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found
Mar 19 09:20:09.021868 master-0 kubenswrapper[7457]: E0319 09:20:09.021854 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:13.021840571 +0000 UTC m=+8.877179961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:09.025857 master-0 kubenswrapper[7457]: E0319 09:20:09.025136 7457 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263"
Mar 19 09:20:09.025857 master-0 kubenswrapper[7457]: E0319 09:20:09.025306 7457 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9blbc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-b865698dc-f6kkd_openshift-service-ca-operator(1694c93a-9acb-4bec-bfd6-3ec370e7a0b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 09:20:09.026609 master-0 kubenswrapper[7457]: E0319 09:20:09.026576 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" podUID="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4"
Mar 19 09:20:10.112456 master-0 kubenswrapper[7457]: E0319 09:20:10.112384 7457 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458"
Mar 19 09:20:10.113260 master-0 kubenswrapper[7457]: E0319 09:20:10.112615 7457 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458,Command:[cluster-kube-controller-manager-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458,ValueFrom:nil,},EnvVar{Name:CLUSTER_POLICY_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:67c988e079558dc6b20232ebf9a7f7276fee60c756caed584c9715e0bec77a5a,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-controller-manager-operator-ff989d6cc-pvlq6_openshift-kube-controller-manager-operator(f0c75102-6790-4ed3-84da-61c3611186f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 09:20:10.113859 master-0 kubenswrapper[7457]: E0319 09:20:10.113820 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" podUID="f0c75102-6790-4ed3-84da-61c3611186f8"
Mar 19 09:20:11.214470 master-0 kubenswrapper[7457]: E0319 09:20:11.214346 7457 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e"
Mar 19 09:20:11.215023 master-0 kubenswrapper[7457]: E0319 09:20:11.214704 7457 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-apiserver-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,Command:[cluster-openshift-apiserver-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:KUBE_APISERVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvd6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-apiserver-operator-d65958b8-55s59_openshift-apiserver-operator(3b333a1e-2a7f-423a-8b40-99f30c89f740): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 09:20:11.215928 master-0 kubenswrapper[7457]: E0319 09:20:11.215895 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" podUID="3b333a1e-2a7f-423a-8b40-99f30c89f740"
Mar 19 09:20:11.951841 master-0 kubenswrapper[7457]: I0319 09:20:11.951745 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:20:12.171886 master-0 kubenswrapper[7457]: E0319 09:20:12.171802 7457 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252"
Mar 19 09:20:12.172072 master-0 kubenswrapper[7457]: E0319 09:20:12.171990 7457 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-storage-version-migrator-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252,Command:[cluster-kube-storage-version-migrator-operator start],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf6dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7_openshift-kube-storage-version-migrator-operator(43cb2a3b-40e2-45ee-894a-6c833ee17efd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 09:20:12.173339 master-0 kubenswrapper[7457]: E0319 09:20:12.173223 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" podUID="43cb2a3b-40e2-45ee-894a-6c833ee17efd"
Mar 19 09:20:12.584303 master-0 kubenswrapper[7457]: E0319 09:20:12.584208 7457 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85"
Mar 19 09:20:12.585088 master-0 kubenswrapper[7457]: E0319 09:20:12.584387 7457 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:copy-catalogd-manifests,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85,Command:[/bin/sh],Args:[-c cp -a /openshift/manifests /operand-assets/catalogd],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:operand-assets,ReadOnly:false,MountPath:/operand-assets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rqsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000380000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-olm-operator-67dcd4998-p9czl_openshift-cluster-olm-operator(3b50118d-f7c2-4bff-aca0-5c6623819baf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 09:20:12.585756 master-0 kubenswrapper[7457]: E0319 09:20:12.585689 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"copy-catalogd-manifests\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" podUID="3b50118d-f7c2-4bff-aca0-5c6623819baf"
Mar 19 09:20:13.074562 master-0 kubenswrapper[7457]: I0319 09:20:13.074486 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:13.074562 master-0 kubenswrapper[7457]: I0319 09:20:13.074556 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:13.074889 master-0 kubenswrapper[7457]: E0319 09:20:13.074672 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:20:13.074889 master-0 kubenswrapper[7457]: E0319 09:20:13.074765 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.074743065 +0000 UTC m=+16.930082545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found
Mar 19 09:20:13.074889 master-0 kubenswrapper[7457]: I0319 09:20:13.074810 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:13.074889 master-0 kubenswrapper[7457]: I0319 09:20:13.074867 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:13.075062 master-0 kubenswrapper[7457]: I0319 09:20:13.074894 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:20:13.075062 master-0 kubenswrapper[7457]: I0319 09:20:13.074930 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:13.075062 master-0 kubenswrapper[7457]: I0319 09:20:13.074955 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:13.075062 master-0 kubenswrapper[7457]: I0319 09:20:13.074983 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:13.075062 master-0 kubenswrapper[7457]: I0319 09:20:13.075014 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:20:13.075062 master-0 kubenswrapper[7457]: I0319 09:20:13.075039 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:20:13.075062 master-0 kubenswrapper[7457]: I0319 09:20:13.075063 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:13.075259 master-0 kubenswrapper[7457]: I0319 09:20:13.075091 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:20:13.075259 master-0 kubenswrapper[7457]: I0319 09:20:13.075118 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:13.075259 master-0 kubenswrapper[7457]: E0319 09:20:13.075193 7457 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:13.075259 master-0 kubenswrapper[7457]: E0319 09:20:13.075221 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075212377 +0000 UTC m=+16.930551837 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:13.075367 master-0 kubenswrapper[7457]: E0319 09:20:13.075266 7457 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:13.075367 master-0 kubenswrapper[7457]: E0319 09:20:13.075293 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.07528488 +0000 UTC m=+16.930624250 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found
Mar 19 09:20:13.075367 master-0 kubenswrapper[7457]: E0319 09:20:13.075336 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:20:13.075367 master-0 kubenswrapper[7457]: E0319 09:20:13.075360 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075352721 +0000 UTC m=+16.930692091 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075400 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075424 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075416663 +0000 UTC m=+16.930756043 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075464 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075488 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075480475 +0000 UTC m=+16.930819955 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075552 7457 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075580 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075572157 +0000 UTC m=+16.930911527 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075625 7457 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075647 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075639318 +0000 UTC m=+16.930978688 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075686 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075709 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.07570199 +0000 UTC m=+16.931041360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:13.075737 master-0 kubenswrapper[7457]: E0319 09:20:13.075747 7457 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:13.076035 master-0 kubenswrapper[7457]: E0319 09:20:13.075773 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075765642 +0000 UTC m=+16.931105012 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:20:13.076035 master-0 kubenswrapper[7457]: E0319 09:20:13.075819 7457 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:20:13.076035 master-0 kubenswrapper[7457]: E0319 09:20:13.075841 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075834363 +0000 UTC m=+16.931173733 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found Mar 19 09:20:13.076035 master-0 kubenswrapper[7457]: E0319 09:20:13.075881 7457 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:20:13.076035 master-0 kubenswrapper[7457]: E0319 09:20:13.075904 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075896845 +0000 UTC m=+16.931236215 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found Mar 19 09:20:13.076035 master-0 kubenswrapper[7457]: E0319 09:20:13.075944 7457 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:20:13.076035 master-0 kubenswrapper[7457]: E0319 09:20:13.075966 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:21.075959636 +0000 UTC m=+16.931299006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : secret "metrics-daemon-secret" not found Mar 19 09:20:13.187127 master-0 kubenswrapper[7457]: I0319 09:20:13.187061 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:13.222268 master-0 kubenswrapper[7457]: I0319 09:20:13.222198 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:13.409362 master-0 kubenswrapper[7457]: I0319 09:20:13.409263 7457 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:20:13.409362 master-0 kubenswrapper[7457]: I0319 09:20:13.409289 7457 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:20:14.404772 master-0 kubenswrapper[7457]: E0319 09:20:14.404688 7457 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55" Mar 19 09:20:14.405422 master-0 kubenswrapper[7457]: E0319 09:20:14.404908 7457 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j65pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-qfc76_openshift-network-operator(4f65184f-8fc2-4656-8776-a3b962aa1f5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:20:14.406302 master-0 kubenswrapper[7457]: E0319 09:20:14.406202 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-network-operator/iptables-alerter-qfc76" podUID="4f65184f-8fc2-4656-8776-a3b962aa1f5d" Mar 19 09:20:14.814735 master-0 kubenswrapper[7457]: I0319 09:20:14.814183 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:14.821537 master-0 kubenswrapper[7457]: 
I0319 09:20:14.821453 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:15.021146 master-0 kubenswrapper[7457]: I0319 09:20:15.020993 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:15.021462 master-0 kubenswrapper[7457]: I0319 09:20:15.021405 7457 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:20:15.021462 master-0 kubenswrapper[7457]: I0319 09:20:15.021444 7457 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:20:15.057515 master-0 kubenswrapper[7457]: I0319 09:20:15.057442 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:15.415404 master-0 kubenswrapper[7457]: I0319 09:20:15.415301 7457 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:20:15.423370 master-0 kubenswrapper[7457]: I0319 09:20:15.423305 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:20:15.908284 master-0 kubenswrapper[7457]: I0319 09:20:15.908211 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:15.926699 master-0 kubenswrapper[7457]: I0319 09:20:15.926636 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:20:16.646885 master-0 kubenswrapper[7457]: E0319 09:20:16.646754 7457 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3" Mar 19 09:20:16.647556 master-0 kubenswrapper[7457]: 
E0319 09:20:16.647353 7457 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:csi-snapshot-controller-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3,Command:[],Args:[start -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERAND_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24,ValueFrom:nil,},EnvVar{Name:WEBHOOK_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e46378af340ca82a8551fdfa20d0acf4ff4a5d43ceb0d4748eebc55be437d04,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9plst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000150000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-snapshot-controller-operator-5f5d689c6b-jv8lm_openshift-cluster-storage-operator(8e073eb4-67f2-4de7-8848-50da73079dbc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:20:16.648616 master-0 kubenswrapper[7457]: E0319 09:20:16.648560 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm" podUID="8e073eb4-67f2-4de7-8848-50da73079dbc" Mar 19 09:20:17.321783 master-0 kubenswrapper[7457]: I0319 09:20:17.321391 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4s5vc"] Mar 19 09:20:17.326835 master-0 kubenswrapper[7457]: W0319 09:20:17.326792 7457 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 WatchSource:0}: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:20:17.429381 master-0 kubenswrapper[7457]: I0319 09:20:17.429294 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" event={"ID":"d664acc4-ec4f-4078-ae93-404a14ea18fc","Type":"ContainerStarted","Data":"f068dc00867ec832963c43c66c2b3ba5e5c27207844ca25057536cc59dfa3810"} Mar 19 09:20:17.433804 master-0 kubenswrapper[7457]: I0319 09:20:17.433740 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" event={"ID":"a1098584-43b9-4f2c-83d2-22d95fb7b0c3","Type":"ContainerStarted","Data":"dbe5b6ac78d411669d4c2885f202f3dc2681af9deb4ef2161f47be9747a76bd6"} Mar 19 09:20:17.435263 master-0 kubenswrapper[7457]: I0319 09:20:17.435209 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" event={"ID":"8527f5cd-2992-44be-90b8-e9086cedf46e","Type":"ContainerStarted","Data":"c9da4601818f501772c5c387239e3219ab4432a2bb45b7271b716c82c40ddaf7"} Mar 19 09:20:17.436563 master-0 kubenswrapper[7457]: I0319 09:20:17.436496 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" event={"ID":"e7fae040-28fa-4d97-8482-fd0dd12cc921","Type":"ContainerStarted","Data":"7899eaeea83e799e75607f310011944713a832305f4796c7131bde2f6c40224c"} Mar 19 09:20:21.172193 master-0 kubenswrapper[7457]: I0319 09:20:21.171730 7457 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:21.172193 master-0 kubenswrapper[7457]: E0319 09:20:21.172086 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172287 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.172176 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172330 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.172290884 +0000 UTC m=+33.027630474 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172366 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.172352566 +0000 UTC m=+33.027692176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.172460 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.172505 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:20:21.173866 master-0 
kubenswrapper[7457]: E0319 09:20:21.172627 7457 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172666 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.172651583 +0000 UTC m=+33.027990953 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172781 7457 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172841 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172869 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.172861938 +0000 UTC m=+33.028201308 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.172796 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.172886 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.172878319 +0000 UTC m=+33.028217679 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.172982 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.173029 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.173073 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173095 7457 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.173105 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173132 7457 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173135 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173118575 +0000 UTC m=+33.028457945 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173185 7457 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.173193 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173212 7457 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173205507 +0000 UTC m=+33.028544877 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173185 7457 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173226 7457 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.173244 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173253 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173243408 +0000 UTC m=+33.028582778 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173280 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173293 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173278589 +0000 UTC m=+33.028618159 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173324 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173307009 +0000 UTC m=+33.028646629 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : secret "metrics-daemon-secret" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.173347 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: I0319 09:20:21.173392 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173425 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173416672 +0000 UTC m=+33.028756042 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173442 7457 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173491 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173462833 +0000 UTC m=+33.028802433 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173559 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:20:21.173866 master-0 kubenswrapper[7457]: E0319 09:20:21.173594 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.173586256 +0000 UTC m=+33.028925626 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: I0319 09:20:35.164465 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"] Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: E0319 09:20:35.164957 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: I0319 09:20:35.164971 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: E0319 09:20:35.164979 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: I0319 09:20:35.164986 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: I0319 09:20:35.165045 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: I0319 09:20:35.165054 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:20:35.165860 master-0 kubenswrapper[7457]: I0319 09:20:35.165328 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.171109 master-0 kubenswrapper[7457]: I0319 09:20:35.166724 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:20:35.171109 master-0 kubenswrapper[7457]: I0319 09:20:35.167498 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:20:35.171109 master-0 kubenswrapper[7457]: I0319 09:20:35.167750 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:20:35.171109 master-0 kubenswrapper[7457]: I0319 09:20:35.167978 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:20:35.171109 master-0 kubenswrapper[7457]: I0319 09:20:35.168173 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:20:35.182486 master-0 kubenswrapper[7457]: I0319 09:20:35.181044 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"] Mar 19 09:20:35.207952 master-0 kubenswrapper[7457]: I0319 09:20:35.207915 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"] Mar 19 09:20:35.210924 master-0 kubenswrapper[7457]: I0319 09:20:35.208703 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.227981 master-0 kubenswrapper[7457]: I0319 09:20:35.221518 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:20:35.227981 master-0 kubenswrapper[7457]: I0319 09:20:35.221538 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:20:35.227981 master-0 kubenswrapper[7457]: I0319 09:20:35.221859 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 09:20:35.227981 master-0 kubenswrapper[7457]: I0319 09:20:35.223018 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:20:35.227981 master-0 kubenswrapper[7457]: I0319 09:20:35.226306 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:20:35.227981 master-0 kubenswrapper[7457]: I0319 09:20:35.227451 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"] Mar 19 09:20:35.237364 master-0 kubenswrapper[7457]: I0319 09:20:35.237312 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.270826 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.270874 7457 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.270912 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.270930 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-proxy-ca-bundles\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.270973 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk9cq\" (UniqueName: \"kubernetes.io/projected/a43e1754-66ff-49c4-8e64-65be7bae2819-kube-api-access-lk9cq\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.271014 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-config\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.271091 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.271145 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4vn\" (UniqueName: \"kubernetes.io/projected/ddc5c712-7d46-4bb9-947e-4d08c5a16102-kube-api-access-8c4vn\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.271585 master-0 kubenswrapper[7457]: I0319 09:20:35.271191 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-config\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.373053 master-0 kubenswrapper[7457]: I0319 09:20:35.373001 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " 
pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.373358 master-0 kubenswrapper[7457]: I0319 09:20:35.373345 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-proxy-ca-bundles\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.374096 master-0 kubenswrapper[7457]: E0319 09:20:35.374055 7457 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:35.374162 master-0 kubenswrapper[7457]: E0319 09:20:35.374135 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert podName:ddc5c712-7d46-4bb9-947e-4d08c5a16102 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.874115207 +0000 UTC m=+31.729454577 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert") pod "controller-manager-7bb65fd8bc-rfsjc" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102") : secret "serving-cert" not found Mar 19 09:20:35.374353 master-0 kubenswrapper[7457]: I0319 09:20:35.374331 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk9cq\" (UniqueName: \"kubernetes.io/projected/a43e1754-66ff-49c4-8e64-65be7bae2819-kube-api-access-lk9cq\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.374459 master-0 kubenswrapper[7457]: I0319 09:20:35.374442 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.374577 master-0 kubenswrapper[7457]: I0319 09:20:35.374563 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-config\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.374658 master-0 kubenswrapper[7457]: I0319 09:20:35.374646 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4vn\" (UniqueName: \"kubernetes.io/projected/ddc5c712-7d46-4bb9-947e-4d08c5a16102-kube-api-access-8c4vn\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " 
pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.374745 master-0 kubenswrapper[7457]: I0319 09:20:35.374732 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-config\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.374855 master-0 kubenswrapper[7457]: I0319 09:20:35.374840 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.374949 master-0 kubenswrapper[7457]: I0319 09:20:35.374934 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.375193 master-0 kubenswrapper[7457]: E0319 09:20:35.375175 7457 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:35.375314 master-0 kubenswrapper[7457]: E0319 09:20:35.375303 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.875280516 +0000 UTC m=+31.730619876 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : secret "serving-cert" not found Mar 19 09:20:35.375412 master-0 kubenswrapper[7457]: E0319 09:20:35.375398 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:35.375516 master-0 kubenswrapper[7457]: E0319 09:20:35.375506 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca podName:ddc5c712-7d46-4bb9-947e-4d08c5a16102 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.875498033 +0000 UTC m=+31.730837403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca") pod "controller-manager-7bb65fd8bc-rfsjc" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102") : configmap "client-ca" not found Mar 19 09:20:35.376369 master-0 kubenswrapper[7457]: E0319 09:20:35.376313 7457 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:35.376503 master-0 kubenswrapper[7457]: E0319 09:20:35.376476 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.876449287 +0000 UTC m=+31.731788857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : configmap "client-ca" not found Mar 19 09:20:35.376609 master-0 kubenswrapper[7457]: I0319 09:20:35.376556 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-config\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.376998 master-0 kubenswrapper[7457]: I0319 09:20:35.376982 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-config\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.378769 master-0 kubenswrapper[7457]: I0319 09:20:35.378748 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-proxy-ca-bundles\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.400648 master-0 kubenswrapper[7457]: I0319 09:20:35.400599 7457 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 09:20:35.406799 master-0 kubenswrapper[7457]: I0319 09:20:35.406754 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk9cq\" (UniqueName: \"kubernetes.io/projected/a43e1754-66ff-49c4-8e64-65be7bae2819-kube-api-access-lk9cq\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.408071 master-0 kubenswrapper[7457]: I0319 09:20:35.407219 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c4vn\" (UniqueName: \"kubernetes.io/projected/ddc5c712-7d46-4bb9-947e-4d08c5a16102-kube-api-access-8c4vn\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.497144 master-0 kubenswrapper[7457]: I0319 09:20:35.497071 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" event={"ID":"3b333a1e-2a7f-423a-8b40-99f30c89f740","Type":"ContainerStarted","Data":"e7857b0cae9f1e592c846367f20964b7bdba92f2c028bce9260e23037d2618d9"} Mar 19 09:20:35.499078 master-0 kubenswrapper[7457]: I0319 09:20:35.499037 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" event={"ID":"43cb2a3b-40e2-45ee-894a-6c833ee17efd","Type":"ContainerStarted","Data":"c4276c1e12973c262c98545548719e35835681298a10338c9d6009cc8f7eb867"} Mar 19 09:20:35.503456 master-0 kubenswrapper[7457]: I0319 09:20:35.503020 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" event={"ID":"f0c75102-6790-4ed3-84da-61c3611186f8","Type":"ContainerStarted","Data":"46cd0596efe1a555d079c79fdb72a64ad03bb94cd6e0d19c502033e4b3f35b63"} Mar 19 09:20:35.505006 master-0 kubenswrapper[7457]: I0319 09:20:35.504963 7457 generic.go:334] "Generic (PLEG): container finished" podID="3b50118d-f7c2-4bff-aca0-5c6623819baf" containerID="eda46613435f0ad25039ff0c6a8755c37babfb4638110ab33aa3ce1f440dd317" exitCode=0 Mar 19 09:20:35.505149 master-0 kubenswrapper[7457]: I0319 09:20:35.505130 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" event={"ID":"3b50118d-f7c2-4bff-aca0-5c6623819baf","Type":"ContainerDied","Data":"eda46613435f0ad25039ff0c6a8755c37babfb4638110ab33aa3ce1f440dd317"} Mar 19 09:20:35.507826 master-0 kubenswrapper[7457]: I0319 09:20:35.507138 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" event={"ID":"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4","Type":"ContainerStarted","Data":"33c416f3ddb853fb82ea998149e13a2a8f2bd563b1774b31ddf6b2c491ae3aa9"} Mar 19 09:20:35.509498 master-0 kubenswrapper[7457]: I0319 09:20:35.509433 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm" event={"ID":"8e073eb4-67f2-4de7-8848-50da73079dbc","Type":"ContainerStarted","Data":"a4f08ffa469d576f1ff721d7a848a9dcbe02ec450f751dda967c9dc0db841d3f"} Mar 19 09:20:35.881658 master-0 kubenswrapper[7457]: I0319 09:20:35.881608 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " 
pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.881983 master-0 kubenswrapper[7457]: E0319 09:20:35.881792 7457 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:35.881983 master-0 kubenswrapper[7457]: E0319 09:20:35.881892 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.881869878 +0000 UTC m=+32.737209328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : secret "serving-cert" not found Mar 19 09:20:35.881983 master-0 kubenswrapper[7457]: I0319 09:20:35.881952 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:35.882119 master-0 kubenswrapper[7457]: I0319 09:20:35.882014 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.882119 master-0 kubenswrapper[7457]: E0319 09:20:35.882097 7457 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not 
found Mar 19 09:20:35.882203 master-0 kubenswrapper[7457]: I0319 09:20:35.882113 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" Mar 19 09:20:35.882299 master-0 kubenswrapper[7457]: E0319 09:20:35.882167 7457 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:35.882349 master-0 kubenswrapper[7457]: E0319 09:20:35.882184 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.882158245 +0000 UTC m=+32.737497695 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : configmap "client-ca" not found Mar 19 09:20:35.882349 master-0 kubenswrapper[7457]: E0319 09:20:35.882188 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:35.882349 master-0 kubenswrapper[7457]: E0319 09:20:35.882336 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert podName:ddc5c712-7d46-4bb9-947e-4d08c5a16102 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.8823251 +0000 UTC m=+32.737664530 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert") pod "controller-manager-7bb65fd8bc-rfsjc" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102") : secret "serving-cert" not found Mar 19 09:20:35.882478 master-0 kubenswrapper[7457]: E0319 09:20:35.882364 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca podName:ddc5c712-7d46-4bb9-947e-4d08c5a16102 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.882352341 +0000 UTC m=+32.737691791 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca") pod "controller-manager-7bb65fd8bc-rfsjc" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102") : configmap "client-ca" not found Mar 19 09:20:36.104393 master-0 kubenswrapper[7457]: I0319 09:20:36.104329 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"] Mar 19 09:20:36.104727 master-0 kubenswrapper[7457]: E0319 09:20:36.104642 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc" podUID="ddc5c712-7d46-4bb9-947e-4d08c5a16102" Mar 19 09:20:36.202979 master-0 kubenswrapper[7457]: I0319 09:20:36.202866 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"] Mar 19 09:20:36.203935 master-0 kubenswrapper[7457]: I0319 09:20:36.203390 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"
Mar 19 09:20:36.220932 master-0 kubenswrapper[7457]: I0319 09:20:36.220886 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"]
Mar 19 09:20:36.295562 master-0 kubenswrapper[7457]: I0319 09:20:36.293236 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww85l\" (UniqueName: \"kubernetes.io/projected/dc65ec1f-b8fb-40d6-ac39-46b255a33221-kube-api-access-ww85l\") pod \"csi-snapshot-controller-64854d9cff-v9s9c\" (UID: \"dc65ec1f-b8fb-40d6-ac39-46b255a33221\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"
Mar 19 09:20:36.394171 master-0 kubenswrapper[7457]: I0319 09:20:36.394098 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww85l\" (UniqueName: \"kubernetes.io/projected/dc65ec1f-b8fb-40d6-ac39-46b255a33221-kube-api-access-ww85l\") pod \"csi-snapshot-controller-64854d9cff-v9s9c\" (UID: \"dc65ec1f-b8fb-40d6-ac39-46b255a33221\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"
Mar 19 09:20:36.416995 master-0 kubenswrapper[7457]: I0319 09:20:36.416943 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww85l\" (UniqueName: \"kubernetes.io/projected/dc65ec1f-b8fb-40d6-ac39-46b255a33221-kube-api-access-ww85l\") pod \"csi-snapshot-controller-64854d9cff-v9s9c\" (UID: \"dc65ec1f-b8fb-40d6-ac39-46b255a33221\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"
Mar 19 09:20:36.514332 master-0 kubenswrapper[7457]: I0319 09:20:36.514198 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"
Mar 19 09:20:36.522847 master-0 kubenswrapper[7457]: I0319 09:20:36.522710 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"
Mar 19 09:20:36.531078 master-0 kubenswrapper[7457]: I0319 09:20:36.530666 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"
Mar 19 09:20:36.597856 master-0 kubenswrapper[7457]: I0319 09:20:36.597027 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-config\") pod \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") "
Mar 19 09:20:36.597856 master-0 kubenswrapper[7457]: I0319 09:20:36.597094 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-proxy-ca-bundles\") pod \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") "
Mar 19 09:20:36.597856 master-0 kubenswrapper[7457]: I0319 09:20:36.597139 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c4vn\" (UniqueName: \"kubernetes.io/projected/ddc5c712-7d46-4bb9-947e-4d08c5a16102-kube-api-access-8c4vn\") pod \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") "
Mar 19 09:20:36.597856 master-0 kubenswrapper[7457]: I0319 09:20:36.597539 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-config" (OuterVolumeSpecName: "config") pod "ddc5c712-7d46-4bb9-947e-4d08c5a16102" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:36.598118 master-0 kubenswrapper[7457]: I0319 09:20:36.598011 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ddc5c712-7d46-4bb9-947e-4d08c5a16102" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:36.601280 master-0 kubenswrapper[7457]: I0319 09:20:36.601234 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc5c712-7d46-4bb9-947e-4d08c5a16102-kube-api-access-8c4vn" (OuterVolumeSpecName: "kube-api-access-8c4vn") pod "ddc5c712-7d46-4bb9-947e-4d08c5a16102" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102"). InnerVolumeSpecName "kube-api-access-8c4vn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:20:36.699133 master-0 kubenswrapper[7457]: I0319 09:20:36.699072 7457 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:36.699133 master-0 kubenswrapper[7457]: I0319 09:20:36.699123 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c4vn\" (UniqueName: \"kubernetes.io/projected/ddc5c712-7d46-4bb9-947e-4d08c5a16102-kube-api-access-8c4vn\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:36.699133 master-0 kubenswrapper[7457]: I0319 09:20:36.699141 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:36.739946 master-0 kubenswrapper[7457]: I0319 09:20:36.739894 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"]
Mar 19 09:20:36.901065 master-0 kubenswrapper[7457]: I0319 09:20:36.901006 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"
Mar 19 09:20:36.901278 master-0 kubenswrapper[7457]: I0319 09:20:36.901087 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"
Mar 19 09:20:36.901278 master-0 kubenswrapper[7457]: E0319 09:20:36.901185 7457 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:36.901278 master-0 kubenswrapper[7457]: I0319 09:20:36.901220 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"
Mar 19 09:20:36.901278 master-0 kubenswrapper[7457]: E0319 09:20:36.901252 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.901231539 +0000 UTC m=+34.756570909 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : secret "serving-cert" not found
Mar 19 09:20:36.901278 master-0 kubenswrapper[7457]: E0319 09:20:36.901271 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:36.901517 master-0 kubenswrapper[7457]: E0319 09:20:36.901298 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca podName:ddc5c712-7d46-4bb9-947e-4d08c5a16102 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.90128833 +0000 UTC m=+34.756627700 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca") pod "controller-manager-7bb65fd8bc-rfsjc" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102") : configmap "client-ca" not found
Mar 19 09:20:36.901591 master-0 kubenswrapper[7457]: E0319 09:20:36.901484 7457 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:36.901781 master-0 kubenswrapper[7457]: E0319 09:20:36.901658 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.901616469 +0000 UTC m=+34.756955979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : configmap "client-ca" not found
Mar 19 09:20:36.901781 master-0 kubenswrapper[7457]: I0319 09:20:36.901717 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert\") pod \"controller-manager-7bb65fd8bc-rfsjc\" (UID: \"ddc5c712-7d46-4bb9-947e-4d08c5a16102\") " pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"
Mar 19 09:20:36.902317 master-0 kubenswrapper[7457]: E0319 09:20:36.902013 7457 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:36.902317 master-0 kubenswrapper[7457]: E0319 09:20:36.902104 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert podName:ddc5c712-7d46-4bb9-947e-4d08c5a16102 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.902083731 +0000 UTC m=+34.757423101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert") pod "controller-manager-7bb65fd8bc-rfsjc" (UID: "ddc5c712-7d46-4bb9-947e-4d08c5a16102") : secret "serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.205843 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206261 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206074 7457 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206303 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206330 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206355 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs podName:13072c08-c77c-4170-9ebe-98d63968747b nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206332779 +0000 UTC m=+65.061672149 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs") pod "network-metrics-daemon-nq9vs" (UID: "13072c08-c77c-4170-9ebe-98d63968747b") : secret "metrics-daemon-secret" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206392 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206427 7457 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206467 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls podName:8c8ee765-76b8-4cde-8acb-6e5edd1b8149 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206457012 +0000 UTC m=+65.061796442 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-rtzvj" (UID: "8c8ee765-76b8-4cde-8acb-6e5edd1b8149") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206490 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206520 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206552 7457 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206586 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls podName:03d12dab-1215-4c1f-a9f5-27ea7174d308 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206577225 +0000 UTC m=+65.061916675 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls") pod "ingress-operator-66b84d69b-rvwfh" (UID: "03d12dab-1215-4c1f-a9f5-27ea7174d308") : secret "metrics-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206618 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206674 7457 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206686 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206713 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206747 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206722 7457 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206705 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics podName:dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206694848 +0000 UTC m=+65.062034218 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-gxznr" (UID: "dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e") : secret "marketplace-operator-metrics" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206836 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206820522 +0000 UTC m=+65.062159892 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "node-tuning-operator-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206851 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert podName:259794ab-d027-497a-b08e-5a6d79057668 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206843213 +0000 UTC m=+65.062182583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert") pod "catalog-operator-68f85b4d6c-jg9m5" (UID: "259794ab-d027-497a-b08e-5a6d79057668") : secret "catalog-operator-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206864 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert podName:7b29cb7b-26d2-4fab-9e03-2d7fdf937592 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206858533 +0000 UTC m=+65.062197903 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert") pod "olm-operator-5c9796789-rh692" (UID: "7b29cb7b-26d2-4fab-9e03-2d7fdf937592") : secret "olm-operator-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.206874 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert podName:9a6c1523-e77c-4aac-814c-05d41215c42f nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.206869504 +0000 UTC m=+65.062208874 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-5jsnd" (UID: "9a6c1523-e77c-4aac-814c-05d41215c42f") : secret "package-server-manager-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206891 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206922 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206948 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206964 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: I0319 09:20:37.206982 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207080 7457 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207108 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls podName:c247d991-809e-46b6-9617-9b05007b7560 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.207098929 +0000 UTC m=+65.062438299 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5m8t6" (UID: "c247d991-809e-46b6-9617-9b05007b7560") : secret "image-registry-operator-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207178 7457 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207198 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert podName:51b88818-5108-40db-90c8-4f2e7198959e nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.207191952 +0000 UTC m=+65.062531322 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert") pod "cluster-version-operator-56d8475767-prd2q" (UID: "51b88818-5108-40db-90c8-4f2e7198959e") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207228 7457 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207247 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert podName:9076d131-644a-4332-8a70-34f6b0f71575 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.207240113 +0000 UTC m=+65.062579493 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-smksb" (UID: "9076d131-644a-4332-8a70-34f6b0f71575") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207287 7457 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207310 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls podName:16d2930b-486b-492d-983e-c6702d8f53a7 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.207301604 +0000 UTC m=+65.062640974 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls") pod "dns-operator-9c5679d8f-cbw4r" (UID: "16d2930b-486b-492d-983e-c6702d8f53a7") : secret "metrics-tls" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207350 7457 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:20:37.207709 master-0 kubenswrapper[7457]: E0319 09:20:37.207373 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs podName:8beda3a0-a653-4810-b3f2-d25badb21ab1 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.207366026 +0000 UTC m=+65.062705396 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-fvh8d" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1") : secret "multus-admission-controller-secret" not found
Mar 19 09:20:37.518752 master-0 kubenswrapper[7457]: I0319 09:20:37.518725 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"
Mar 19 09:20:37.524670 master-0 kubenswrapper[7457]: I0319 09:20:37.524637 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerStarted","Data":"2702ffe2a7096514b0cf147d61b08f45ac487590697d47b826f39e03c4994a7d"}
Mar 19 09:20:37.554125 master-0 kubenswrapper[7457]: I0319 09:20:37.554066 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"]
Mar 19 09:20:37.554565 master-0 kubenswrapper[7457]: I0319 09:20:37.554487 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.556060 master-0 kubenswrapper[7457]: I0319 09:20:37.556017 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:20:37.559849 master-0 kubenswrapper[7457]: I0319 09:20:37.559808 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:20:37.560040 master-0 kubenswrapper[7457]: I0319 09:20:37.560012 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:20:37.560629 master-0 kubenswrapper[7457]: I0319 09:20:37.560184 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:20:37.578333 master-0 kubenswrapper[7457]: I0319 09:20:37.572831 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:20:37.578333 master-0 kubenswrapper[7457]: I0319 09:20:37.576278 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"]
Mar 19 09:20:37.578333 master-0 kubenswrapper[7457]: I0319 09:20:37.576320 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"]
Mar 19 09:20:37.578333 master-0 kubenswrapper[7457]: I0319 09:20:37.577774 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bb65fd8bc-rfsjc"]
Mar 19 09:20:37.584591 master-0 kubenswrapper[7457]: I0319 09:20:37.580089 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:20:37.619556 master-0 kubenswrapper[7457]: I0319 09:20:37.612052 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-config\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.619556 master-0 kubenswrapper[7457]: I0319 09:20:37.612119 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx8gb\" (UniqueName: \"kubernetes.io/projected/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-kube-api-access-kx8gb\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.619556 master-0 kubenswrapper[7457]: I0319 09:20:37.612296 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.619556 master-0 kubenswrapper[7457]: I0319 09:20:37.612492 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.627562 master-0 kubenswrapper[7457]: I0319 09:20:37.612767 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-proxy-ca-bundles\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.728367 master-0 kubenswrapper[7457]: I0319 09:20:37.728298 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-config\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.728367 master-0 kubenswrapper[7457]: I0319 09:20:37.728358 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx8gb\" (UniqueName: \"kubernetes.io/projected/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-kube-api-access-kx8gb\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.728601 master-0 kubenswrapper[7457]: I0319 09:20:37.728558 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.728669 master-0 kubenswrapper[7457]: I0319 09:20:37.728639 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.728768 master-0 kubenswrapper[7457]: E0319 09:20:37.728733 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:37.728768 master-0 kubenswrapper[7457]: E0319 09:20:37.728757 7457 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:37.728865 master-0 kubenswrapper[7457]: E0319 09:20:37.728802 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.228785259 +0000 UTC m=+34.084124629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : secret "serving-cert" not found
Mar 19 09:20:37.728865 master-0 kubenswrapper[7457]: E0319 09:20:37.728815 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.228809739 +0000 UTC m=+34.084149109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : configmap "client-ca" not found
Mar 19 09:20:37.728964 master-0 kubenswrapper[7457]: I0319 09:20:37.728879 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-proxy-ca-bundles\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.729136 master-0 kubenswrapper[7457]: I0319 09:20:37.729104 7457 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc5c712-7d46-4bb9-947e-4d08c5a16102-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:37.729136 master-0 kubenswrapper[7457]: I0319 09:20:37.729130 7457 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddc5c712-7d46-4bb9-947e-4d08c5a16102-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:37.729802 master-0 kubenswrapper[7457]: I0319 09:20:37.729770 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-config\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.730120 master-0 kubenswrapper[7457]: I0319 09:20:37.730079 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-proxy-ca-bundles\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:37.756710 master-0 kubenswrapper[7457]: I0319 09:20:37.747846 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx8gb\" (UniqueName: \"kubernetes.io/projected/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-kube-api-access-kx8gb\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:38.102004 master-0 kubenswrapper[7457]: I0319 09:20:38.095505 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-g9497"]
Mar 19 09:20:38.102004 master-0 kubenswrapper[7457]: I0319 09:20:38.096105 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497"
Mar 19 09:20:38.102004 master-0 kubenswrapper[7457]: I0319 09:20:38.098234 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 09:20:38.102004 master-0 kubenswrapper[7457]: I0319 09:20:38.098568 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 09:20:38.115387 master-0 kubenswrapper[7457]: I0319 09:20:38.115347 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-g9497"]
Mar 19 09:20:38.234705 master-0 kubenswrapper[7457]: I0319 09:20:38.234662 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg4cn\" (UniqueName: \"kubernetes.io/projected/56e11aac-d199-404a-a0e2-82c28926746d-kube-api-access-pg4cn\") pod \"migrator-8487694857-g9497\" (UID: \"56e11aac-d199-404a-a0e2-82c28926746d\") "
pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" Mar 19 09:20:38.235538 master-0 kubenswrapper[7457]: I0319 09:20:38.234850 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" Mar 19 09:20:38.235538 master-0 kubenswrapper[7457]: I0319 09:20:38.234926 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" Mar 19 09:20:38.235538 master-0 kubenswrapper[7457]: E0319 09:20:38.235127 7457 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:38.235538 master-0 kubenswrapper[7457]: E0319 09:20:38.235182 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:39.235162805 +0000 UTC m=+35.090502175 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : secret "serving-cert" not found Mar 19 09:20:38.235538 master-0 kubenswrapper[7457]: E0319 09:20:38.235226 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:38.235538 master-0 kubenswrapper[7457]: E0319 09:20:38.235250 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:39.235241847 +0000 UTC m=+35.090581217 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : configmap "client-ca" not found Mar 19 09:20:38.336397 master-0 kubenswrapper[7457]: I0319 09:20:38.336261 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg4cn\" (UniqueName: \"kubernetes.io/projected/56e11aac-d199-404a-a0e2-82c28926746d-kube-api-access-pg4cn\") pod \"migrator-8487694857-g9497\" (UID: \"56e11aac-d199-404a-a0e2-82c28926746d\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" Mar 19 09:20:38.340256 master-0 kubenswrapper[7457]: I0319 09:20:38.340202 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc5c712-7d46-4bb9-947e-4d08c5a16102" path="/var/lib/kubelet/pods/ddc5c712-7d46-4bb9-947e-4d08c5a16102/volumes" Mar 19 09:20:38.353383 master-0 kubenswrapper[7457]: I0319 09:20:38.353330 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg4cn\" (UniqueName: 
\"kubernetes.io/projected/56e11aac-d199-404a-a0e2-82c28926746d-kube-api-access-pg4cn\") pod \"migrator-8487694857-g9497\" (UID: \"56e11aac-d199-404a-a0e2-82c28926746d\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" Mar 19 09:20:38.448026 master-0 kubenswrapper[7457]: I0319 09:20:38.447941 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" Mar 19 09:20:38.535742 master-0 kubenswrapper[7457]: I0319 09:20:38.535327 7457 generic.go:334] "Generic (PLEG): container finished" podID="3b50118d-f7c2-4bff-aca0-5c6623819baf" containerID="a44bc43b2d58d1a0d645e857d97d66ce4eb842ccd368241fdd8860524859bfed" exitCode=0 Mar 19 09:20:38.535857 master-0 kubenswrapper[7457]: I0319 09:20:38.535425 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" event={"ID":"3b50118d-f7c2-4bff-aca0-5c6623819baf","Type":"ContainerDied","Data":"a44bc43b2d58d1a0d645e857d97d66ce4eb842ccd368241fdd8860524859bfed"} Mar 19 09:20:38.539706 master-0 kubenswrapper[7457]: I0319 09:20:38.539668 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qfc76" event={"ID":"4f65184f-8fc2-4656-8776-a3b962aa1f5d","Type":"ContainerStarted","Data":"745cf3b5879d3ecd2ab4d9ea3ff970b75f3167e744c5ed97b87f84b2e9a89610"} Mar 19 09:20:38.640867 master-0 kubenswrapper[7457]: I0319 09:20:38.640587 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-g9497"] Mar 19 09:20:38.656828 master-0 kubenswrapper[7457]: W0319 09:20:38.656766 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56e11aac_d199_404a_a0e2_82c28926746d.slice/crio-2e855eb091c3c19e500b45068cb0c0c879f09feb5f816f86f9d1253a9f1c5dcd WatchSource:0}: Error finding container 
2e855eb091c3c19e500b45068cb0c0c879f09feb5f816f86f9d1253a9f1c5dcd: Status 404 returned error can't find the container with id 2e855eb091c3c19e500b45068cb0c0c879f09feb5f816f86f9d1253a9f1c5dcd Mar 19 09:20:38.949221 master-0 kubenswrapper[7457]: I0319 09:20:38.949093 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:38.949221 master-0 kubenswrapper[7457]: I0319 09:20:38.949197 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:38.949492 master-0 kubenswrapper[7457]: E0319 09:20:38.949336 7457 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:38.949492 master-0 kubenswrapper[7457]: E0319 09:20:38.949382 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:42.949369234 +0000 UTC m=+38.804708604 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : configmap "client-ca" not found Mar 19 09:20:38.949832 master-0 kubenswrapper[7457]: E0319 09:20:38.949800 7457 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:38.949885 master-0 kubenswrapper[7457]: E0319 09:20:38.949835 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:42.949827226 +0000 UTC m=+38.805166596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : secret "serving-cert" not found Mar 19 09:20:38.967492 master-0 kubenswrapper[7457]: I0319 09:20:38.966575 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-l54xv"] Mar 19 09:20:38.967492 master-0 kubenswrapper[7457]: I0319 09:20:38.967120 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:38.969289 master-0 kubenswrapper[7457]: I0319 09:20:38.969216 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:20:38.969643 master-0 kubenswrapper[7457]: I0319 09:20:38.969569 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 09:20:38.969643 master-0 kubenswrapper[7457]: I0319 09:20:38.969595 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 09:20:38.969787 master-0 kubenswrapper[7457]: I0319 09:20:38.969658 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 09:20:38.984013 master-0 kubenswrapper[7457]: I0319 09:20:38.983953 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-l54xv"] Mar 19 09:20:39.052971 master-0 kubenswrapper[7457]: I0319 09:20:39.051919 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-key\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.052971 master-0 kubenswrapper[7457]: I0319 09:20:39.052034 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-cabundle\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.052971 master-0 kubenswrapper[7457]: I0319 09:20:39.052093 7457 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbdq\" (UniqueName: \"kubernetes.io/projected/a3dddb56-d180-4b8a-85bd-77c3888d8f71-kube-api-access-nxbdq\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.153788 master-0 kubenswrapper[7457]: I0319 09:20:39.153718 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-cabundle\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.154263 master-0 kubenswrapper[7457]: I0319 09:20:39.154224 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbdq\" (UniqueName: \"kubernetes.io/projected/a3dddb56-d180-4b8a-85bd-77c3888d8f71-kube-api-access-nxbdq\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.154810 master-0 kubenswrapper[7457]: I0319 09:20:39.154766 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-key\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.156284 master-0 kubenswrapper[7457]: I0319 09:20:39.156255 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-cabundle\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 
09:20:39.163972 master-0 kubenswrapper[7457]: I0319 09:20:39.163935 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-key\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.172602 master-0 kubenswrapper[7457]: I0319 09:20:39.172515 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbdq\" (UniqueName: \"kubernetes.io/projected/a3dddb56-d180-4b8a-85bd-77c3888d8f71-kube-api-access-nxbdq\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.255911 master-0 kubenswrapper[7457]: I0319 09:20:39.255810 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" Mar 19 09:20:39.256841 master-0 kubenswrapper[7457]: I0319 09:20:39.256821 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" Mar 19 09:20:39.257090 master-0 kubenswrapper[7457]: E0319 09:20:39.255965 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:39.257213 master-0 kubenswrapper[7457]: E0319 09:20:39.257200 7457 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:41.257178834 +0000 UTC m=+37.112518214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : configmap "client-ca" not found Mar 19 09:20:39.257291 master-0 kubenswrapper[7457]: E0319 09:20:39.257017 7457 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:39.257393 master-0 kubenswrapper[7457]: E0319 09:20:39.257382 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:41.257373289 +0000 UTC m=+37.112712649 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : secret "serving-cert" not found Mar 19 09:20:39.291836 master-0 kubenswrapper[7457]: I0319 09:20:39.291719 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:20:39.506303 master-0 kubenswrapper[7457]: I0319 09:20:39.505871 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-l54xv"] Mar 19 09:20:39.513569 master-0 kubenswrapper[7457]: W0319 09:20:39.513519 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3dddb56_d180_4b8a_85bd_77c3888d8f71.slice/crio-db26592d73a7736c3517fdc2847e1043ee77738e2bbfdfcabaf0fa7701a43b04 WatchSource:0}: Error finding container db26592d73a7736c3517fdc2847e1043ee77738e2bbfdfcabaf0fa7701a43b04: Status 404 returned error can't find the container with id db26592d73a7736c3517fdc2847e1043ee77738e2bbfdfcabaf0fa7701a43b04 Mar 19 09:20:39.545657 master-0 kubenswrapper[7457]: I0319 09:20:39.545604 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerStarted","Data":"5732fa3ff6aaea0289273acea825bfaab46efed575658d801c96fd54df3453e0"} Mar 19 09:20:39.547285 master-0 kubenswrapper[7457]: I0319 09:20:39.547239 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" event={"ID":"a3dddb56-d180-4b8a-85bd-77c3888d8f71","Type":"ContainerStarted","Data":"db26592d73a7736c3517fdc2847e1043ee77738e2bbfdfcabaf0fa7701a43b04"} Mar 19 09:20:39.549162 master-0 kubenswrapper[7457]: I0319 09:20:39.549100 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" event={"ID":"56e11aac-d199-404a-a0e2-82c28926746d","Type":"ContainerStarted","Data":"2e855eb091c3c19e500b45068cb0c0c879f09feb5f816f86f9d1253a9f1c5dcd"} Mar 19 09:20:39.566207 master-0 kubenswrapper[7457]: I0319 09:20:39.566082 7457 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" podStartSLOduration=1.00684974 podStartE2EDuration="3.566047362s" podCreationTimestamp="2026-03-19 09:20:36 +0000 UTC" firstStartedPulling="2026-03-19 09:20:36.750965158 +0000 UTC m=+32.606304528" lastFinishedPulling="2026-03-19 09:20:39.31016279 +0000 UTC m=+35.165502150" observedRunningTime="2026-03-19 09:20:39.565239291 +0000 UTC m=+35.420578661" watchObservedRunningTime="2026-03-19 09:20:39.566047362 +0000 UTC m=+35.421386732" Mar 19 09:20:40.556759 master-0 kubenswrapper[7457]: I0319 09:20:40.556687 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" event={"ID":"a3dddb56-d180-4b8a-85bd-77c3888d8f71","Type":"ContainerStarted","Data":"95dacee2cb30e36457f1d33ad364f9a224d5660270f3f7bb7614ed2e09f9ce55"} Mar 19 09:20:40.596557 master-0 kubenswrapper[7457]: I0319 09:20:40.595771 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" podStartSLOduration=2.59575638 podStartE2EDuration="2.59575638s" podCreationTimestamp="2026-03-19 09:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:20:40.59381396 +0000 UTC m=+36.449153330" watchObservedRunningTime="2026-03-19 09:20:40.59575638 +0000 UTC m=+36.451095750" Mar 19 09:20:41.312282 master-0 kubenswrapper[7457]: I0319 09:20:41.312215 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" Mar 19 09:20:41.312499 master-0 kubenswrapper[7457]: I0319 09:20:41.312404 7457 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" Mar 19 09:20:41.312499 master-0 kubenswrapper[7457]: E0319 09:20:41.312417 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:41.312606 master-0 kubenswrapper[7457]: E0319 09:20:41.312516 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:45.312495565 +0000 UTC m=+41.167835005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : configmap "client-ca" not found Mar 19 09:20:41.312606 master-0 kubenswrapper[7457]: E0319 09:20:41.312517 7457 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:41.312606 master-0 kubenswrapper[7457]: E0319 09:20:41.312591 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:45.312578087 +0000 UTC m=+41.167917447 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : secret "serving-cert" not found Mar 19 09:20:42.568918 master-0 kubenswrapper[7457]: I0319 09:20:42.568861 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" event={"ID":"3b50118d-f7c2-4bff-aca0-5c6623819baf","Type":"ContainerStarted","Data":"4be81bb4984289b1445cac9a7d29a8575166e1227c0a164f03ad826b2adf5846"} Mar 19 09:20:42.571282 master-0 kubenswrapper[7457]: I0319 09:20:42.571188 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" event={"ID":"56e11aac-d199-404a-a0e2-82c28926746d","Type":"ContainerStarted","Data":"a6fdb30a6cb98e65c9b430c67eefdded66fbbe8da20ef275c57b4d7843aa1504"} Mar 19 09:20:42.571282 master-0 kubenswrapper[7457]: I0319 09:20:42.571267 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" event={"ID":"56e11aac-d199-404a-a0e2-82c28926746d","Type":"ContainerStarted","Data":"04ab539d0892371c0a0e303a0171da833221893f892a45bc222d5e9e06986cc9"} Mar 19 09:20:42.814553 master-0 kubenswrapper[7457]: I0319 09:20:42.811773 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" podStartSLOduration=1.687318097 podStartE2EDuration="4.811755s" podCreationTimestamp="2026-03-19 09:20:38 +0000 UTC" firstStartedPulling="2026-03-19 09:20:38.65885667 +0000 UTC m=+34.514196040" lastFinishedPulling="2026-03-19 09:20:41.783293573 +0000 UTC m=+37.638632943" observedRunningTime="2026-03-19 09:20:42.81099881 +0000 UTC m=+38.666338170" watchObservedRunningTime="2026-03-19 09:20:42.811755 +0000 UTC m=+38.667094370" Mar 19 
09:20:43.029938 master-0 kubenswrapper[7457]: I0319 09:20:43.029861 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:43.030153 master-0 kubenswrapper[7457]: I0319 09:20:43.029978 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:43.030153 master-0 kubenswrapper[7457]: E0319 09:20:43.030014 7457 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:43.030153 master-0 kubenswrapper[7457]: E0319 09:20:43.030069 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:51.030054514 +0000 UTC m=+46.885393884 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : secret "serving-cert" not found
Mar 19 09:20:43.030287 master-0 kubenswrapper[7457]: E0319 09:20:43.030169 7457 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:43.030287 master-0 kubenswrapper[7457]: E0319 09:20:43.030257 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:51.030239288 +0000 UTC m=+46.885578658 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : configmap "client-ca" not found
Mar 19 09:20:45.355322 master-0 kubenswrapper[7457]: I0319 09:20:45.354882 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:45.356606 master-0 kubenswrapper[7457]: I0319 09:20:45.355434 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:45.356606 master-0 kubenswrapper[7457]: E0319 09:20:45.355514 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:45.356606 master-0 kubenswrapper[7457]: E0319 09:20:45.355607 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:53.355592646 +0000 UTC m=+49.210932016 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : configmap "client-ca" not found
Mar 19 09:20:45.368953 master-0 kubenswrapper[7457]: I0319 09:20:45.368895 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:45.449965 master-0 kubenswrapper[7457]: I0319 09:20:45.449916 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"]
Mar 19 09:20:45.450623 master-0 kubenswrapper[7457]: I0319 09:20:45.450595 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.452731 master-0 kubenswrapper[7457]: I0319 09:20:45.452694 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 09:20:45.452955 master-0 kubenswrapper[7457]: I0319 09:20:45.452935 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 09:20:45.453213 master-0 kubenswrapper[7457]: I0319 09:20:45.453185 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 09:20:45.453517 master-0 kubenswrapper[7457]: I0319 09:20:45.453485 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 09:20:45.453517 master-0 kubenswrapper[7457]: I0319 09:20:45.453506 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 09:20:45.453746 master-0 kubenswrapper[7457]: I0319 09:20:45.453727 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 09:20:45.453815 master-0 kubenswrapper[7457]: I0319 09:20:45.453793 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:20:45.453858 master-0 kubenswrapper[7457]: I0319 09:20:45.453735 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 09:20:45.456068 master-0 kubenswrapper[7457]: I0319 09:20:45.456022 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-policies\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.456150 master-0 kubenswrapper[7457]: I0319 09:20:45.456073 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.456150 master-0 kubenswrapper[7457]: I0319 09:20:45.456098 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-encryption-config\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.456212 master-0 kubenswrapper[7457]: I0319 09:20:45.456159 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-dir\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.456240 master-0 kubenswrapper[7457]: I0319 09:20:45.456224 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-serving-ca\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.456428 master-0 kubenswrapper[7457]: I0319 09:20:45.456283 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-client\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.456428 master-0 kubenswrapper[7457]: I0319 09:20:45.456385 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-trusted-ca-bundle\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.456428 master-0 kubenswrapper[7457]: I0319 09:20:45.456421 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7ppn\" (UniqueName: \"kubernetes.io/projected/5a51c701-7f2a-4332-a301-746e8a0eb475-kube-api-access-g7ppn\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.469535 master-0 kubenswrapper[7457]: I0319 09:20:45.469477 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"]
Mar 19 09:20:45.557018 master-0 kubenswrapper[7457]: I0319 09:20:45.556964 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-policies\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557018 master-0 kubenswrapper[7457]: I0319 09:20:45.557007 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557251 master-0 kubenswrapper[7457]: E0319 09:20:45.557102 7457 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:45.557251 master-0 kubenswrapper[7457]: I0319 09:20:45.557153 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-encryption-config\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557251 master-0 kubenswrapper[7457]: E0319 09:20:45.557212 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert podName:5a51c701-7f2a-4332-a301-746e8a0eb475 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:46.057184349 +0000 UTC m=+41.912523719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert") pod "apiserver-57c47bdf6-d9h47" (UID: "5a51c701-7f2a-4332-a301-746e8a0eb475") : secret "serving-cert" not found
Mar 19 09:20:45.557337 master-0 kubenswrapper[7457]: I0319 09:20:45.557311 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-dir\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557372 master-0 kubenswrapper[7457]: I0319 09:20:45.557340 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-serving-ca\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557372 master-0 kubenswrapper[7457]: I0319 09:20:45.557366 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-client\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557647 master-0 kubenswrapper[7457]: I0319 09:20:45.557621 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-trusted-ca-bundle\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557754 master-0 kubenswrapper[7457]: I0319 09:20:45.557740 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7ppn\" (UniqueName: \"kubernetes.io/projected/5a51c701-7f2a-4332-a301-746e8a0eb475-kube-api-access-g7ppn\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557824 master-0 kubenswrapper[7457]: I0319 09:20:45.557785 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-policies\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.557884 master-0 kubenswrapper[7457]: I0319 09:20:45.557823 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-dir\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.558182 master-0 kubenswrapper[7457]: I0319 09:20:45.558142 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-serving-ca\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.558847 master-0 kubenswrapper[7457]: I0319 09:20:45.558814 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-trusted-ca-bundle\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.560444 master-0 kubenswrapper[7457]: I0319 09:20:45.560396 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-encryption-config\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.561065 master-0 kubenswrapper[7457]: I0319 09:20:45.561036 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-client\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:45.585297 master-0 kubenswrapper[7457]: I0319 09:20:45.585246 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7ppn\" (UniqueName: \"kubernetes.io/projected/5a51c701-7f2a-4332-a301-746e8a0eb475-kube-api-access-g7ppn\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:46.062426 master-0 kubenswrapper[7457]: I0319 09:20:46.062027 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:46.062640 master-0 kubenswrapper[7457]: E0319 09:20:46.062230 7457 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:46.062729 master-0 kubenswrapper[7457]: E0319 09:20:46.062657 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert podName:5a51c701-7f2a-4332-a301-746e8a0eb475 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.062635641 +0000 UTC m=+42.917975011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert") pod "apiserver-57c47bdf6-d9h47" (UID: "5a51c701-7f2a-4332-a301-746e8a0eb475") : secret "serving-cert" not found
Mar 19 09:20:47.073787 master-0 kubenswrapper[7457]: I0319 09:20:47.073722 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:20:47.074898 master-0 kubenswrapper[7457]: E0319 09:20:47.073832 7457 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:47.074898 master-0 kubenswrapper[7457]: E0319 09:20:47.073900 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert podName:5a51c701-7f2a-4332-a301-746e8a0eb475 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:49.073880693 +0000 UTC m=+44.929220063 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert") pod "apiserver-57c47bdf6-d9h47" (UID: "5a51c701-7f2a-4332-a301-746e8a0eb475") : secret "serving-cert" not found
Mar 19 09:20:47.092802 master-0 kubenswrapper[7457]: I0319 09:20:47.092751 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 19 09:20:47.093332 master-0 kubenswrapper[7457]: I0319 09:20:47.093304 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.095268 master-0 kubenswrapper[7457]: I0319 09:20:47.095232 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 19 09:20:47.121035 master-0 kubenswrapper[7457]: I0319 09:20:47.120637 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 19 09:20:47.176551 master-0 kubenswrapper[7457]: I0319 09:20:47.176487 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.176551 master-0 kubenswrapper[7457]: I0319 09:20:47.176595 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-var-lock\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.176833 master-0 kubenswrapper[7457]: I0319 09:20:47.176617 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.277637 master-0 kubenswrapper[7457]: I0319 09:20:47.277572 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-var-lock\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.277637 master-0 kubenswrapper[7457]: I0319 09:20:47.277629 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.277877 master-0 kubenswrapper[7457]: I0319 09:20:47.277714 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-var-lock\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.278003 master-0 kubenswrapper[7457]: I0319 09:20:47.277969 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.278098 master-0 kubenswrapper[7457]: I0319 09:20:47.278072 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.496226 master-0 kubenswrapper[7457]: I0319 09:20:47.496092 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.639343 master-0 kubenswrapper[7457]: I0319 09:20:47.639292 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f79cb767-hd66w"]
Mar 19 09:20:47.640066 master-0 kubenswrapper[7457]: I0319 09:20:47.640042 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.643119 master-0 kubenswrapper[7457]: I0319 09:20:47.643051 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0"
Mar 19 09:20:47.643212 master-0 kubenswrapper[7457]: I0319 09:20:47.643062 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:20:47.643212 master-0 kubenswrapper[7457]: I0319 09:20:47.643188 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 09:20:47.644313 master-0 kubenswrapper[7457]: I0319 09:20:47.644271 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 09:20:47.644313 master-0 kubenswrapper[7457]: I0319 09:20:47.644303 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 09:20:47.644404 master-0 kubenswrapper[7457]: I0319 09:20:47.644330 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0"
Mar 19 09:20:47.649078 master-0 kubenswrapper[7457]: I0319 09:20:47.649044 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 09:20:47.649528 master-0 kubenswrapper[7457]: I0319 09:20:47.649498 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 09:20:47.649603 master-0 kubenswrapper[7457]: I0319 09:20:47.649548 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 09:20:47.656975 master-0 kubenswrapper[7457]: I0319 09:20:47.656890 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 09:20:47.682885 master-0 kubenswrapper[7457]: I0319 09:20:47.682821 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-client\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.682885 master-0 kubenswrapper[7457]: I0319 09:20:47.682873 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-encryption-config\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.682885 master-0 kubenswrapper[7457]: I0319 09:20:47.682903 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-trusted-ca-bundle\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.682977 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.682996 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-serving-ca\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.683071 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-node-pullsecrets\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.683131 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw548\" (UniqueName: \"kubernetes.io/projected/5aa38084-8254-4c8e-9b84-b9eee680c329-kube-api-access-vw548\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.683157 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-config\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.683261 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.683296 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-audit-dir\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.683350 master-0 kubenswrapper[7457]: I0319 09:20:47.683351 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-image-import-ca\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.704938 master-0 kubenswrapper[7457]: I0319 09:20:47.702897 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f79cb767-hd66w"]
Mar 19 09:20:47.707704 master-0 kubenswrapper[7457]: I0319 09:20:47.707658 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.784864 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw548\" (UniqueName: \"kubernetes.io/projected/5aa38084-8254-4c8e-9b84-b9eee680c329-kube-api-access-vw548\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.784918 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-config\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.785293 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.785321 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-audit-dir\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.785628 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-audit-dir\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.785625 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-image-import-ca\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.785719 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-client\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: E0319 09:20:47.785738 7457 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: E0319 09:20:47.785791 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:48.285773122 +0000 UTC m=+44.141112592 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : secret "serving-cert" not found
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.785811 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-encryption-config\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.785851 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-trusted-ca-bundle\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.786097 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-config\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.786413 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.786454 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-image-import-ca\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.786464 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-serving-ca\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: E0319 09:20:47.786578 7457 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: E0319 09:20:47.786641 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:48.286616524 +0000 UTC m=+44.141955894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : configmap "audit-0" not found
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.786668 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-node-pullsecrets\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.786774 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-node-pullsecrets\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.787379 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-serving-ca\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.788501 master-0 kubenswrapper[7457]: I0319 09:20:47.787800 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-trusted-ca-bundle\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.793454 master-0 kubenswrapper[7457]: I0319 09:20:47.791221 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-encryption-config\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.793454 master-0 kubenswrapper[7457]: I0319 09:20:47.791313 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-client\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.853071 master-0 kubenswrapper[7457]: I0319 09:20:47.847509 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw548\" (UniqueName: \"kubernetes.io/projected/5aa38084-8254-4c8e-9b84-b9eee680c329-kube-api-access-vw548\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w"
Mar 19 09:20:47.938371 master-0 kubenswrapper[7457]: I0319 09:20:47.938295 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 19 09:20:47.948104 master-0 kubenswrapper[7457]: W0319 09:20:47.948018 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod06b63f3f_ca62_4195_80e9_7ff427e1c58b.slice/crio-238008c11c92aeb65c62f807e37180e81b3384c3527719ecefddec5721f48b97 WatchSource:0}: Error finding container 238008c11c92aeb65c62f807e37180e81b3384c3527719ecefddec5721f48b97: Status 404 returned error can't find the container with id 238008c11c92aeb65c62f807e37180e81b3384c3527719ecefddec5721f48b97
Mar 19 09:20:48.293567 master-0 kubenswrapper[7457]: I0319 09:20:48.293482 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName:
\"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:48.294416 master-0 kubenswrapper[7457]: E0319 09:20:48.293645 7457 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 09:20:48.294416 master-0 kubenswrapper[7457]: I0319 09:20:48.293738 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:48.294416 master-0 kubenswrapper[7457]: E0319 09:20:48.293838 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:49.29379226 +0000 UTC m=+45.149131640 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : configmap "audit-0" not found Mar 19 09:20:48.294416 master-0 kubenswrapper[7457]: E0319 09:20:48.293856 7457 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:48.294416 master-0 kubenswrapper[7457]: E0319 09:20:48.293949 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:49.293933824 +0000 UTC m=+45.149273274 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : secret "serving-cert" not found Mar 19 09:20:48.596064 master-0 kubenswrapper[7457]: I0319 09:20:48.595994 7457 generic.go:334] "Generic (PLEG): container finished" podID="3b50118d-f7c2-4bff-aca0-5c6623819baf" containerID="4be81bb4984289b1445cac9a7d29a8575166e1227c0a164f03ad826b2adf5846" exitCode=0 Mar 19 09:20:48.596264 master-0 kubenswrapper[7457]: I0319 09:20:48.596082 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" event={"ID":"3b50118d-f7c2-4bff-aca0-5c6623819baf","Type":"ContainerDied","Data":"4be81bb4984289b1445cac9a7d29a8575166e1227c0a164f03ad826b2adf5846"} Mar 19 09:20:48.596583 master-0 kubenswrapper[7457]: I0319 09:20:48.596535 7457 scope.go:117] "RemoveContainer" containerID="4be81bb4984289b1445cac9a7d29a8575166e1227c0a164f03ad826b2adf5846" Mar 19 09:20:48.597637 master-0 kubenswrapper[7457]: I0319 09:20:48.597579 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"06b63f3f-ca62-4195-80e9-7ff427e1c58b","Type":"ContainerStarted","Data":"724d8a9b85240c0b1df62f7319e7755ef432c021c652343b5814cdc6b0afd1ef"} Mar 19 09:20:48.597637 master-0 kubenswrapper[7457]: I0319 09:20:48.597603 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"06b63f3f-ca62-4195-80e9-7ff427e1c58b","Type":"ContainerStarted","Data":"238008c11c92aeb65c62f807e37180e81b3384c3527719ecefddec5721f48b97"} Mar 19 09:20:48.633879 master-0 kubenswrapper[7457]: I0319 09:20:48.632690 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=2.632671211 
podStartE2EDuration="2.632671211s" podCreationTimestamp="2026-03-19 09:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:20:48.631941933 +0000 UTC m=+44.487281303" watchObservedRunningTime="2026-03-19 09:20:48.632671211 +0000 UTC m=+44.488010611" Mar 19 09:20:49.105100 master-0 kubenswrapper[7457]: I0319 09:20:49.105017 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:20:49.105347 master-0 kubenswrapper[7457]: E0319 09:20:49.105242 7457 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:49.105347 master-0 kubenswrapper[7457]: E0319 09:20:49.105327 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert podName:5a51c701-7f2a-4332-a301-746e8a0eb475 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:53.105306817 +0000 UTC m=+48.960646247 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert") pod "apiserver-57c47bdf6-d9h47" (UID: "5a51c701-7f2a-4332-a301-746e8a0eb475") : secret "serving-cert" not found Mar 19 09:20:49.306868 master-0 kubenswrapper[7457]: I0319 09:20:49.306798 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:49.307483 master-0 kubenswrapper[7457]: E0319 09:20:49.307046 7457 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:49.307483 master-0 kubenswrapper[7457]: E0319 09:20:49.307132 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:51.307110496 +0000 UTC m=+47.162449946 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : secret "serving-cert" not found Mar 19 09:20:49.307483 master-0 kubenswrapper[7457]: I0319 09:20:49.307129 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:49.307483 master-0 kubenswrapper[7457]: E0319 09:20:49.307272 7457 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 09:20:49.307483 master-0 kubenswrapper[7457]: E0319 09:20:49.307327 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:51.307311011 +0000 UTC m=+47.162650451 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : configmap "audit-0" not found Mar 19 09:20:49.454019 master-0 kubenswrapper[7457]: I0319 09:20:49.453509 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 09:20:49.454498 master-0 kubenswrapper[7457]: I0319 09:20:49.454475 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.456667 master-0 kubenswrapper[7457]: I0319 09:20:49.456562 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 09:20:49.462377 master-0 kubenswrapper[7457]: I0319 09:20:49.462322 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 09:20:49.510371 master-0 kubenswrapper[7457]: I0319 09:20:49.510255 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.510788 master-0 kubenswrapper[7457]: I0319 09:20:49.510432 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.510788 master-0 kubenswrapper[7457]: I0319 09:20:49.510638 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-var-lock\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.605947 master-0 kubenswrapper[7457]: I0319 09:20:49.605892 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" event={"ID":"3b50118d-f7c2-4bff-aca0-5c6623819baf","Type":"ContainerStarted","Data":"f76bcee5a7887b5ce41895da9deeb005bff3bd9491b1cae94603ecd2c30da8a6"} Mar 19 
09:20:49.612387 master-0 kubenswrapper[7457]: I0319 09:20:49.612247 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.612514 master-0 kubenswrapper[7457]: I0319 09:20:49.612500 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.613037 master-0 kubenswrapper[7457]: I0319 09:20:49.612972 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.613218 master-0 kubenswrapper[7457]: I0319 09:20:49.613165 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-var-lock\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.613295 master-0 kubenswrapper[7457]: I0319 09:20:49.613274 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-var-lock\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:49.946828 master-0 kubenswrapper[7457]: I0319 09:20:49.946221 7457 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:50.068876 master-0 kubenswrapper[7457]: I0319 09:20:50.068809 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:20:50.626164 master-0 kubenswrapper[7457]: I0319 09:20:50.626101 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 09:20:50.641480 master-0 kubenswrapper[7457]: W0319 09:20:50.641435 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc5e3b99a_24af_42a0_bf5f_d82b91ecbc6a.slice/crio-5102e60e07b0c0187e422871eca34d56a4b64890354534ac5bb4405ed5a663d3 WatchSource:0}: Error finding container 5102e60e07b0c0187e422871eca34d56a4b64890354534ac5bb4405ed5a663d3: Status 404 returned error can't find the container with id 5102e60e07b0c0187e422871eca34d56a4b64890354534ac5bb4405ed5a663d3 Mar 19 09:20:50.738968 master-0 kubenswrapper[7457]: I0319 09:20:50.738655 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-76f79cb767-hd66w"] Mar 19 09:20:50.738968 master-0 kubenswrapper[7457]: E0319 09:20:50.738900 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-76f79cb767-hd66w" podUID="5aa38084-8254-4c8e-9b84-b9eee680c329" Mar 19 09:20:51.030824 master-0 kubenswrapper[7457]: I0319 09:20:51.030771 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") pod 
\"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:51.031017 master-0 kubenswrapper[7457]: E0319 09:20:51.030928 7457 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:51.031017 master-0 kubenswrapper[7457]: E0319 09:20:51.030983 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:07.030966627 +0000 UTC m=+62.886305997 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : secret "serving-cert" not found Mar 19 09:20:51.031106 master-0 kubenswrapper[7457]: I0319 09:20:51.031014 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") pod \"route-controller-manager-86584b59c9-tws4x\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") " pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" Mar 19 09:20:51.031248 master-0 kubenswrapper[7457]: E0319 09:20:51.031192 7457 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:51.031300 master-0 kubenswrapper[7457]: E0319 09:20:51.031284 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca podName:a43e1754-66ff-49c4-8e64-65be7bae2819 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:21:07.031262684 +0000 UTC m=+62.886602054 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca") pod "route-controller-manager-86584b59c9-tws4x" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819") : configmap "client-ca" not found Mar 19 09:20:51.334797 master-0 kubenswrapper[7457]: I0319 09:20:51.334692 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:51.334797 master-0 kubenswrapper[7457]: I0319 09:20:51.334797 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit\") pod \"apiserver-76f79cb767-hd66w\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:51.335087 master-0 kubenswrapper[7457]: E0319 09:20:51.334876 7457 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 09:20:51.335087 master-0 kubenswrapper[7457]: E0319 09:20:51.334919 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:55.334906037 +0000 UTC m=+51.190245407 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : configmap "audit-0" not found Mar 19 09:20:51.335190 master-0 kubenswrapper[7457]: E0319 09:20:51.335097 7457 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:51.335190 master-0 kubenswrapper[7457]: E0319 09:20:51.335188 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert podName:5aa38084-8254-4c8e-9b84-b9eee680c329 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:55.335170664 +0000 UTC m=+51.190510034 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert") pod "apiserver-76f79cb767-hd66w" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329") : secret "serving-cert" not found Mar 19 09:20:51.622508 master-0 kubenswrapper[7457]: I0319 09:20:51.622337 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a","Type":"ContainerStarted","Data":"301aebb0e9930fecf725f0201f719e1159eb2c1c4f88b41cf02dfb10a0bbec0d"} Mar 19 09:20:51.622508 master-0 kubenswrapper[7457]: I0319 09:20:51.622364 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:51.622508 master-0 kubenswrapper[7457]: I0319 09:20:51.622374 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a","Type":"ContainerStarted","Data":"5102e60e07b0c0187e422871eca34d56a4b64890354534ac5bb4405ed5a663d3"} Mar 19 09:20:51.628722 master-0 kubenswrapper[7457]: I0319 09:20:51.628695 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:51.638240 master-0 kubenswrapper[7457]: I0319 09:20:51.638167 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=2.63815067 podStartE2EDuration="2.63815067s" podCreationTimestamp="2026-03-19 09:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:20:51.637834421 +0000 UTC m=+47.493173791" watchObservedRunningTime="2026-03-19 09:20:51.63815067 +0000 UTC m=+47.493490040" Mar 19 09:20:51.739102 master-0 kubenswrapper[7457]: I0319 09:20:51.739032 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-encryption-config\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739102 master-0 kubenswrapper[7457]: I0319 09:20:51.739089 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-serving-ca\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739354 master-0 kubenswrapper[7457]: I0319 09:20:51.739147 7457 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-node-pullsecrets\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739354 master-0 kubenswrapper[7457]: I0319 09:20:51.739180 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-config\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739354 master-0 kubenswrapper[7457]: I0319 09:20:51.739204 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw548\" (UniqueName: \"kubernetes.io/projected/5aa38084-8254-4c8e-9b84-b9eee680c329-kube-api-access-vw548\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739354 master-0 kubenswrapper[7457]: I0319 09:20:51.739232 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-audit-dir\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739354 master-0 kubenswrapper[7457]: I0319 09:20:51.739278 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-client\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739354 master-0 kubenswrapper[7457]: I0319 09:20:51.739271 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-node-pullsecrets" (OuterVolumeSpecName: 
"node-pullsecrets") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:20:51.739354 master-0 kubenswrapper[7457]: I0319 09:20:51.739302 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-image-import-ca\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739600 master-0 kubenswrapper[7457]: I0319 09:20:51.739392 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-trusted-ca-bundle\") pod \"5aa38084-8254-4c8e-9b84-b9eee680c329\" (UID: \"5aa38084-8254-4c8e-9b84-b9eee680c329\") " Mar 19 09:20:51.739600 master-0 kubenswrapper[7457]: I0319 09:20:51.739429 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:20:51.740001 master-0 kubenswrapper[7457]: I0319 09:20:51.739931 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:51.740001 master-0 kubenswrapper[7457]: I0319 09:20:51.739981 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:51.740102 master-0 kubenswrapper[7457]: I0319 09:20:51.739988 7457 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.740102 master-0 kubenswrapper[7457]: I0319 09:20:51.740037 7457 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aa38084-8254-4c8e-9b84-b9eee680c329-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.740174 master-0 kubenswrapper[7457]: I0319 09:20:51.740123 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:51.740174 master-0 kubenswrapper[7457]: I0319 09:20:51.740142 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-config" (OuterVolumeSpecName: "config") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:51.744730 master-0 kubenswrapper[7457]: I0319 09:20:51.744693 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:20:51.744825 master-0 kubenswrapper[7457]: I0319 09:20:51.744731 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:20:51.744975 master-0 kubenswrapper[7457]: I0319 09:20:51.744932 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa38084-8254-4c8e-9b84-b9eee680c329-kube-api-access-vw548" (OuterVolumeSpecName: "kube-api-access-vw548") pod "5aa38084-8254-4c8e-9b84-b9eee680c329" (UID: "5aa38084-8254-4c8e-9b84-b9eee680c329"). InnerVolumeSpecName "kube-api-access-vw548". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:20:51.841006 master-0 kubenswrapper[7457]: I0319 09:20:51.840921 7457 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.841006 master-0 kubenswrapper[7457]: I0319 09:20:51.840986 7457 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.841006 master-0 kubenswrapper[7457]: I0319 09:20:51.840998 7457 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.841006 master-0 kubenswrapper[7457]: I0319 09:20:51.841012 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.841287 master-0 kubenswrapper[7457]: I0319 09:20:51.841027 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw548\" (UniqueName: \"kubernetes.io/projected/5aa38084-8254-4c8e-9b84-b9eee680c329-kube-api-access-vw548\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.841287 master-0 kubenswrapper[7457]: I0319 09:20:51.841040 7457 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.841287 master-0 kubenswrapper[7457]: I0319 09:20:51.841052 7457 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:52.626085 master-0 kubenswrapper[7457]: I0319 09:20:52.625944 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f79cb767-hd66w" Mar 19 09:20:52.665827 master-0 kubenswrapper[7457]: I0319 09:20:52.665069 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-7dcf67dd86-6hgld"] Mar 19 09:20:52.666667 master-0 kubenswrapper[7457]: I0319 09:20:52.665966 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.674625 master-0 kubenswrapper[7457]: I0319 09:20:52.674033 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:20:52.674625 master-0 kubenswrapper[7457]: I0319 09:20:52.674107 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:20:52.674625 master-0 kubenswrapper[7457]: I0319 09:20:52.674423 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 09:20:52.674625 master-0 kubenswrapper[7457]: I0319 09:20:52.674435 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:20:52.674625 master-0 kubenswrapper[7457]: I0319 09:20:52.674566 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 09:20:52.674625 master-0 kubenswrapper[7457]: I0319 09:20:52.674568 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:20:52.675021 master-0 kubenswrapper[7457]: I0319 09:20:52.674807 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 
09:20:52.675021 master-0 kubenswrapper[7457]: I0319 09:20:52.674895 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 09:20:52.675021 master-0 kubenswrapper[7457]: I0319 09:20:52.675010 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:20:52.677827 master-0 kubenswrapper[7457]: I0319 09:20:52.677775 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:20:52.679579 master-0 kubenswrapper[7457]: I0319 09:20:52.679502 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-76f79cb767-hd66w"] Mar 19 09:20:52.689669 master-0 kubenswrapper[7457]: I0319 09:20:52.689629 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-76f79cb767-hd66w"] Mar 19 09:20:52.691373 master-0 kubenswrapper[7457]: I0319 09:20:52.691131 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-7dcf67dd86-6hgld"] Mar 19 09:20:52.756545 master-0 kubenswrapper[7457]: I0319 09:20:52.756449 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-image-import-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.756545 master-0 kubenswrapper[7457]: I0319 09:20:52.756498 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.756822 master-0 kubenswrapper[7457]: I0319 
09:20:52.756599 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5jl\" (UniqueName: \"kubernetes.io/projected/64f60856-22dd-4560-acff-c620e17844a1-kube-api-access-cf5jl\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.756822 master-0 kubenswrapper[7457]: I0319 09:20:52.756657 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-etcd-serving-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.756822 master-0 kubenswrapper[7457]: I0319 09:20:52.756678 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-audit\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.756822 master-0 kubenswrapper[7457]: I0319 09:20:52.756753 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-encryption-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.756822 master-0 kubenswrapper[7457]: I0319 09:20:52.756817 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-trusted-ca-bundle\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: 
\"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.757035 master-0 kubenswrapper[7457]: I0319 09:20:52.756897 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.757035 master-0 kubenswrapper[7457]: I0319 09:20:52.756980 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-node-pullsecrets\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.757106 master-0 kubenswrapper[7457]: I0319 09:20:52.757062 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-audit-dir\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.757106 master-0 kubenswrapper[7457]: I0319 09:20:52.757091 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-etcd-client\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.757161 master-0 kubenswrapper[7457]: I0319 09:20:52.757145 7457 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5aa38084-8254-4c8e-9b84-b9eee680c329-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:52.757161 master-0 kubenswrapper[7457]: I0319 09:20:52.757158 7457 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5aa38084-8254-4c8e-9b84-b9eee680c329-audit\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:52.858336 master-0 kubenswrapper[7457]: I0319 09:20:52.858278 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-etcd-serving-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.858635 master-0 kubenswrapper[7457]: I0319 09:20:52.858615 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-audit\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.858754 master-0 kubenswrapper[7457]: I0319 09:20:52.858737 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-encryption-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.858898 master-0 kubenswrapper[7457]: I0319 09:20:52.858881 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-trusted-ca-bundle\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " 
pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859020 master-0 kubenswrapper[7457]: I0319 09:20:52.859004 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859414 master-0 kubenswrapper[7457]: I0319 09:20:52.859372 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-node-pullsecrets\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859510 master-0 kubenswrapper[7457]: I0319 09:20:52.859487 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-audit-dir\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859689 master-0 kubenswrapper[7457]: I0319 09:20:52.859642 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-node-pullsecrets\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859743 master-0 kubenswrapper[7457]: I0319 09:20:52.859689 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-etcd-client\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: 
\"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859823 master-0 kubenswrapper[7457]: I0319 09:20:52.859800 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-image-import-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859877 master-0 kubenswrapper[7457]: I0319 09:20:52.859837 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859916 master-0 kubenswrapper[7457]: I0319 09:20:52.859887 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5jl\" (UniqueName: \"kubernetes.io/projected/64f60856-22dd-4560-acff-c620e17844a1-kube-api-access-cf5jl\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.859962 master-0 kubenswrapper[7457]: I0319 09:20:52.859935 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-audit\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.860016 master-0 kubenswrapper[7457]: E0319 09:20:52.859655 7457 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:52.860056 master-0 kubenswrapper[7457]: E0319 09:20:52.860030 7457 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert podName:64f60856-22dd-4560-acff-c620e17844a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:53.360009188 +0000 UTC m=+49.215348558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert") pod "apiserver-7dcf67dd86-6hgld" (UID: "64f60856-22dd-4560-acff-c620e17844a1") : secret "serving-cert" not found Mar 19 09:20:52.860157 master-0 kubenswrapper[7457]: I0319 09:20:52.860135 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-audit-dir\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.861976 master-0 kubenswrapper[7457]: I0319 09:20:52.860523 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-trusted-ca-bundle\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.861976 master-0 kubenswrapper[7457]: I0319 09:20:52.860619 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-image-import-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.861976 master-0 kubenswrapper[7457]: I0319 09:20:52.860791 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-config\") pod 
\"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.861976 master-0 kubenswrapper[7457]: I0319 09:20:52.860945 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-etcd-serving-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.864397 master-0 kubenswrapper[7457]: I0319 09:20:52.864328 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-encryption-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.866760 master-0 kubenswrapper[7457]: I0319 09:20:52.866704 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-etcd-client\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:52.879935 master-0 kubenswrapper[7457]: I0319 09:20:52.879813 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5jl\" (UniqueName: \"kubernetes.io/projected/64f60856-22dd-4560-acff-c620e17844a1-kube-api-access-cf5jl\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:53.163060 master-0 kubenswrapper[7457]: I0319 09:20:53.162915 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:20:53.165635 master-0 kubenswrapper[7457]: I0319 09:20:53.165608 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:20:53.271889 master-0 kubenswrapper[7457]: I0319 09:20:53.271835 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:20:53.367619 master-0 kubenswrapper[7457]: I0319 09:20:53.365363 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:53.367619 master-0 kubenswrapper[7457]: I0319 09:20:53.365942 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") pod \"controller-manager-59bcdf994c-hgrrg\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") " pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" Mar 19 09:20:53.367619 master-0 kubenswrapper[7457]: E0319 09:20:53.365609 7457 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:53.367619 master-0 kubenswrapper[7457]: E0319 09:20:53.366165 7457 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert podName:64f60856-22dd-4560-acff-c620e17844a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:54.366146637 +0000 UTC m=+50.221486007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert") pod "apiserver-7dcf67dd86-6hgld" (UID: "64f60856-22dd-4560-acff-c620e17844a1") : secret "serving-cert" not found Mar 19 09:20:53.367619 master-0 kubenswrapper[7457]: E0319 09:20:53.366102 7457 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:53.367619 master-0 kubenswrapper[7457]: E0319 09:20:53.366538 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca podName:c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:09.366522057 +0000 UTC m=+65.221861427 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca") pod "controller-manager-59bcdf994c-hgrrg" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28") : configmap "client-ca" not found Mar 19 09:20:53.447526 master-0 kubenswrapper[7457]: I0319 09:20:53.443370 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"] Mar 19 09:20:53.631152 master-0 kubenswrapper[7457]: I0319 09:20:53.631088 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" event={"ID":"5a51c701-7f2a-4332-a301-746e8a0eb475","Type":"ContainerStarted","Data":"2667f8abc3377f4e949ca5efee8caf3d44c08b3911b024266dc76fb9003cb2e0"} Mar 19 09:20:54.340128 master-0 kubenswrapper[7457]: I0319 09:20:54.339887 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa38084-8254-4c8e-9b84-b9eee680c329" path="/var/lib/kubelet/pods/5aa38084-8254-4c8e-9b84-b9eee680c329/volumes" Mar 19 09:20:54.378983 master-0 kubenswrapper[7457]: I0319 09:20:54.378924 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:54.379200 master-0 kubenswrapper[7457]: E0319 09:20:54.379119 7457 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:54.379271 master-0 kubenswrapper[7457]: E0319 09:20:54.379210 7457 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert podName:64f60856-22dd-4560-acff-c620e17844a1 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:56.379196756 +0000 UTC m=+52.234536126 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert") pod "apiserver-7dcf67dd86-6hgld" (UID: "64f60856-22dd-4560-acff-c620e17844a1") : secret "serving-cert" not found Mar 19 09:20:56.408283 master-0 kubenswrapper[7457]: I0319 09:20:56.407906 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:56.415356 master-0 kubenswrapper[7457]: I0319 09:20:56.415161 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:56.592424 master-0 kubenswrapper[7457]: I0319 09:20:56.592314 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:20:56.831781 master-0 kubenswrapper[7457]: I0319 09:20:56.830251 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-7dcf67dd86-6hgld"] Mar 19 09:20:57.650729 master-0 kubenswrapper[7457]: I0319 09:20:57.650633 7457 generic.go:334] "Generic (PLEG): container finished" podID="5a51c701-7f2a-4332-a301-746e8a0eb475" containerID="f8a8d2ffc695381746f012239f51a0188a35a78cf69857ab2089866fefe4ec7f" exitCode=0 Mar 19 09:20:57.650729 master-0 kubenswrapper[7457]: I0319 09:20:57.650700 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" event={"ID":"5a51c701-7f2a-4332-a301-746e8a0eb475","Type":"ContainerDied","Data":"f8a8d2ffc695381746f012239f51a0188a35a78cf69857ab2089866fefe4ec7f"} Mar 19 09:20:57.653068 master-0 kubenswrapper[7457]: I0319 09:20:57.653042 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" event={"ID":"64f60856-22dd-4560-acff-c620e17844a1","Type":"ContainerStarted","Data":"50cc5384c1b8bb903ef5215671baf6cb4c6d2ce7a00d389208992a278c3b103c"} Mar 19 09:20:58.471267 master-0 kubenswrapper[7457]: I0319 09:20:58.471174 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:20:58.471902 master-0 kubenswrapper[7457]: I0319 09:20:58.471487 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="06b63f3f-ca62-4195-80e9-7ff427e1c58b" containerName="installer" containerID="cri-o://724d8a9b85240c0b1df62f7319e7755ef432c021c652343b5814cdc6b0afd1ef" gracePeriod=30 Mar 19 09:20:58.657928 master-0 kubenswrapper[7457]: I0319 09:20:58.657821 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" 
event={"ID":"5a51c701-7f2a-4332-a301-746e8a0eb475","Type":"ContainerStarted","Data":"b83e947f356498d2a715c279a2f0ca89df75c48ef41ce2e3445e1a0f837bc5c5"} Mar 19 09:20:58.759877 master-0 kubenswrapper[7457]: I0319 09:20:58.759711 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" podStartSLOduration=10.542814704 podStartE2EDuration="13.759687489s" podCreationTimestamp="2026-03-19 09:20:45 +0000 UTC" firstStartedPulling="2026-03-19 09:20:53.455125639 +0000 UTC m=+49.310465009" lastFinishedPulling="2026-03-19 09:20:56.671998424 +0000 UTC m=+52.527337794" observedRunningTime="2026-03-19 09:20:58.759675918 +0000 UTC m=+54.615015288" watchObservedRunningTime="2026-03-19 09:20:58.759687489 +0000 UTC m=+54.615026859" Mar 19 09:20:58.901871 master-0 kubenswrapper[7457]: I0319 09:20:58.901128 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"] Mar 19 09:20:58.902066 master-0 kubenswrapper[7457]: E0319 09:20:58.901985 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg" podUID="c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28" Mar 19 09:20:59.109481 master-0 kubenswrapper[7457]: I0319 09:20:59.109363 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"] Mar 19 09:20:59.109759 master-0 kubenswrapper[7457]: E0319 09:20:59.109672 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x" podUID="a43e1754-66ff-49c4-8e64-65be7bae2819" Mar 19 09:20:59.662239 master-0 
kubenswrapper[7457]: I0319 09:20:59.662145 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"
Mar 19 09:20:59.662239 master-0 kubenswrapper[7457]: I0319 09:20:59.662184 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:59.678205 master-0 kubenswrapper[7457]: I0319 09:20:59.678152 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"
Mar 19 09:20:59.700019 master-0 kubenswrapper[7457]: I0319 09:20:59.694864 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:20:59.748991 master-0 kubenswrapper[7457]: I0319 09:20:59.748938 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") pod \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") "
Mar 19 09:20:59.748991 master-0 kubenswrapper[7457]: I0319 09:20:59.748996 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx8gb\" (UniqueName: \"kubernetes.io/projected/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-kube-api-access-kx8gb\") pod \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") "
Mar 19 09:20:59.749223 master-0 kubenswrapper[7457]: I0319 09:20:59.749023 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-config\") pod \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") "
Mar 19 09:20:59.749223 master-0 kubenswrapper[7457]: I0319 09:20:59.749054 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-config\") pod \"a43e1754-66ff-49c4-8e64-65be7bae2819\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") "
Mar 19 09:20:59.749223 master-0 kubenswrapper[7457]: I0319 09:20:59.749085 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk9cq\" (UniqueName: \"kubernetes.io/projected/a43e1754-66ff-49c4-8e64-65be7bae2819-kube-api-access-lk9cq\") pod \"a43e1754-66ff-49c4-8e64-65be7bae2819\" (UID: \"a43e1754-66ff-49c4-8e64-65be7bae2819\") "
Mar 19 09:20:59.749223 master-0 kubenswrapper[7457]: I0319 09:20:59.749110 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-proxy-ca-bundles\") pod \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\" (UID: \"c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28\") "
Mar 19 09:20:59.749772 master-0 kubenswrapper[7457]: I0319 09:20:59.749733 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-config" (OuterVolumeSpecName: "config") pod "a43e1754-66ff-49c4-8e64-65be7bae2819" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:59.749949 master-0 kubenswrapper[7457]: I0319 09:20:59.749888 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:59.750848 master-0 kubenswrapper[7457]: I0319 09:20:59.750812 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:59.750977 master-0 kubenswrapper[7457]: I0319 09:20:59.750943 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-config" (OuterVolumeSpecName: "config") pod "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:59.756830 master-0 kubenswrapper[7457]: I0319 09:20:59.756666 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43e1754-66ff-49c4-8e64-65be7bae2819-kube-api-access-lk9cq" (OuterVolumeSpecName: "kube-api-access-lk9cq") pod "a43e1754-66ff-49c4-8e64-65be7bae2819" (UID: "a43e1754-66ff-49c4-8e64-65be7bae2819"). InnerVolumeSpecName "kube-api-access-lk9cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:20:59.756830 master-0 kubenswrapper[7457]: I0319 09:20:59.756728 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-kube-api-access-kx8gb" (OuterVolumeSpecName: "kube-api-access-kx8gb") pod "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28"). InnerVolumeSpecName "kube-api-access-kx8gb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:20:59.769785 master-0 kubenswrapper[7457]: I0319 09:20:59.769007 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28" (UID: "c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:20:59.830207 master-0 kubenswrapper[7457]: I0319 09:20:59.830151 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"]
Mar 19 09:20:59.830973 master-0 kubenswrapper[7457]: I0319 09:20:59.830914 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:20:59.834721 master-0 kubenswrapper[7457]: I0319 09:20:59.834647 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 09:20:59.835029 master-0 kubenswrapper[7457]: I0319 09:20:59.835005 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 09:20:59.840143 master-0 kubenswrapper[7457]: I0319 09:20:59.839787 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 09:20:59.863560 master-0 kubenswrapper[7457]: I0319 09:20:59.861105 7457 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:59.863560 master-0 kubenswrapper[7457]: I0319 09:20:59.861193 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx8gb\" (UniqueName: \"kubernetes.io/projected/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-kube-api-access-kx8gb\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:59.863560 master-0 kubenswrapper[7457]: I0319 09:20:59.861207 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:59.863560 master-0 kubenswrapper[7457]: I0319 09:20:59.861217 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk9cq\" (UniqueName: \"kubernetes.io/projected/a43e1754-66ff-49c4-8e64-65be7bae2819-kube-api-access-lk9cq\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:59.863560 master-0 kubenswrapper[7457]: I0319 09:20:59.861225 7457 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:59.866746 master-0 kubenswrapper[7457]: I0319 09:20:59.865899 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"]
Mar 19 09:20:59.962375 master-0 kubenswrapper[7457]: I0319 09:20:59.962072 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f585ebb1-6210-463b-af85-fb29e1e7dfa5-cache\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:20:59.962375 master-0 kubenswrapper[7457]: I0319 09:20:59.962134 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4rw\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-kube-api-access-5g4rw\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:20:59.962375 master-0 kubenswrapper[7457]: I0319 09:20:59.962179 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:20:59.962375 master-0 kubenswrapper[7457]: I0319 09:20:59.962198 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:20:59.962375 master-0 kubenswrapper[7457]: I0319 09:20:59.962305 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:20:59.969129 master-0 kubenswrapper[7457]: I0319 09:20:59.968112 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"]
Mar 19 09:20:59.969129 master-0 kubenswrapper[7457]: I0319 09:20:59.968897 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:20:59.971333 master-0 kubenswrapper[7457]: I0319 09:20:59.970906 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 19 09:20:59.971333 master-0 kubenswrapper[7457]: I0319 09:20:59.971185 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 19 09:20:59.971333 master-0 kubenswrapper[7457]: I0319 09:20:59.971325 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 09:20:59.984764 master-0 kubenswrapper[7457]: I0319 09:20:59.984722 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 19 09:21:00.004326 master-0 kubenswrapper[7457]: I0319 09:21:00.000113 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"]
Mar 19 09:21:00.063671 master-0 kubenswrapper[7457]: I0319 09:21:00.063466 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f585ebb1-6210-463b-af85-fb29e1e7dfa5-cache\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.063671 master-0 kubenswrapper[7457]: I0319 09:21:00.063567 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4rw\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-kube-api-access-5g4rw\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.063671 master-0 kubenswrapper[7457]: I0319 09:21:00.063639 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.063671 master-0 kubenswrapper[7457]: I0319 09:21:00.063663 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.063671 master-0 kubenswrapper[7457]: I0319 09:21:00.063682 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.063963 master-0 kubenswrapper[7457]: I0319 09:21:00.063724 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3374940a-612d-4335-8236-3ffe8d6e73a5-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.063963 master-0 kubenswrapper[7457]: I0319 09:21:00.063893 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.063963 master-0 kubenswrapper[7457]: I0319 09:21:00.063942 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.063963 master-0 kubenswrapper[7457]: I0319 09:21:00.063960 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpcn\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-kube-api-access-kmpcn\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.064080 master-0 kubenswrapper[7457]: I0319 09:21:00.063980 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3374940a-612d-4335-8236-3ffe8d6e73a5-cache\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.064080 master-0 kubenswrapper[7457]: I0319 09:21:00.063998 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.064137 master-0 kubenswrapper[7457]: I0319 09:21:00.064097 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f585ebb1-6210-463b-af85-fb29e1e7dfa5-cache\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.065054 master-0 kubenswrapper[7457]: I0319 09:21:00.064997 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.065882 master-0 kubenswrapper[7457]: I0319 09:21:00.065815 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.068995 master-0 kubenswrapper[7457]: I0319 09:21:00.068957 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.087621 master-0 kubenswrapper[7457]: I0319 09:21:00.087562 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4rw\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-kube-api-access-5g4rw\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.150499 master-0 kubenswrapper[7457]: I0319 09:21:00.150431 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.165264 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.165336 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpcn\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-kube-api-access-kmpcn\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.165362 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3374940a-612d-4335-8236-3ffe8d6e73a5-cache\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.165382 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.165443 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.165567 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3374940a-612d-4335-8236-3ffe8d6e73a5-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.166319 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.166423 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.167236 master-0 kubenswrapper[7457]: I0319 09:21:00.167106 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3374940a-612d-4335-8236-3ffe8d6e73a5-cache\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.170519 master-0 kubenswrapper[7457]: I0319 09:21:00.170469 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.180476 master-0 kubenswrapper[7457]: I0319 09:21:00.180398 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3374940a-612d-4335-8236-3ffe8d6e73a5-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.183383 master-0 kubenswrapper[7457]: I0319 09:21:00.183327 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpcn\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-kube-api-access-kmpcn\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.286630 master-0 kubenswrapper[7457]: I0319 09:21:00.286508 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:21:00.666042 master-0 kubenswrapper[7457]: I0319 09:21:00.665871 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"
Mar 19 09:21:00.666042 master-0 kubenswrapper[7457]: I0319 09:21:00.665898 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"
Mar 19 09:21:01.580007 master-0 kubenswrapper[7457]: I0319 09:21:01.579947 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 19 09:21:01.580514 master-0 kubenswrapper[7457]: I0319 09:21:01.580480 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.654853 master-0 kubenswrapper[7457]: I0319 09:21:01.653518 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 19 09:21:01.655050 master-0 kubenswrapper[7457]: I0319 09:21:01.654892 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"]
Mar 19 09:21:01.655680 master-0 kubenswrapper[7457]: I0319 09:21:01.655367 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.663809 master-0 kubenswrapper[7457]: I0319 09:21:01.663770 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:21:01.667825 master-0 kubenswrapper[7457]: I0319 09:21:01.665036 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:21:01.667825 master-0 kubenswrapper[7457]: I0319 09:21:01.665257 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:21:01.667825 master-0 kubenswrapper[7457]: I0319 09:21:01.665318 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:21:01.667825 master-0 kubenswrapper[7457]: I0319 09:21:01.665635 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:21:01.670766 master-0 kubenswrapper[7457]: I0319 09:21:01.670267 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:21:01.682282 master-0 kubenswrapper[7457]: I0319 09:21:01.682231 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/506da1fd-f439-4b94-9940-3531ae009af0-kube-api-access\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.682428 master-0 kubenswrapper[7457]: I0319 09:21:01.682365 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-var-lock\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.682428 master-0 kubenswrapper[7457]: I0319 09:21:01.682386 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.711163 master-0 kubenswrapper[7457]: I0319 09:21:01.711081 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"]
Mar 19 09:21:01.712872 master-0 kubenswrapper[7457]: I0319 09:21:01.712835 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"]
Mar 19 09:21:01.760623 master-0 kubenswrapper[7457]: I0319 09:21:01.747411 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59bcdf994c-hgrrg"]
Mar 19 09:21:01.783270 master-0 kubenswrapper[7457]: I0319 09:21:01.783147 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6sk\" (UniqueName: \"kubernetes.io/projected/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-kube-api-access-2f6sk\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.783270 master-0 kubenswrapper[7457]: I0319 09:21:01.783206 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-serving-cert\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.783270 master-0 kubenswrapper[7457]: I0319 09:21:01.783230 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-var-lock\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.783270 master-0 kubenswrapper[7457]: I0319 09:21:01.783248 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-client-ca\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.783270 master-0 kubenswrapper[7457]: I0319 09:21:01.783270 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.783651 master-0 kubenswrapper[7457]: I0319 09:21:01.783330 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-proxy-ca-bundles\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.783651 master-0 kubenswrapper[7457]: I0319 09:21:01.783373 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/506da1fd-f439-4b94-9940-3531ae009af0-kube-api-access\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.783651 master-0 kubenswrapper[7457]: I0319 09:21:01.783394 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-config\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.783651 master-0 kubenswrapper[7457]: I0319 09:21:01.783514 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-var-lock\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.783651 master-0 kubenswrapper[7457]: I0319 09:21:01.783560 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.889075 master-0 kubenswrapper[7457]: I0319 09:21:01.886799 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-proxy-ca-bundles\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.889075 master-0 kubenswrapper[7457]: I0319 09:21:01.886903 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-config\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.889075 master-0 kubenswrapper[7457]: I0319 09:21:01.887295 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f6sk\" (UniqueName: \"kubernetes.io/projected/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-kube-api-access-2f6sk\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.889075 master-0 kubenswrapper[7457]: I0319 09:21:01.887377 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-serving-cert\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.889075 master-0 kubenswrapper[7457]: I0319 09:21:01.887418 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-client-ca\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.889075 master-0 kubenswrapper[7457]: I0319 09:21:01.887508 7457 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:01.889075 master-0 kubenswrapper[7457]: I0319 09:21:01.888783 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-proxy-ca-bundles\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.889844 master-0 kubenswrapper[7457]: I0319 09:21:01.889322 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-client-ca\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.889844 master-0 kubenswrapper[7457]: I0319 09:21:01.889382 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-config\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.894592 master-0 kubenswrapper[7457]: I0319 09:21:01.893809 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-serving-cert\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"
Mar 19 09:21:01.915357 master-0 kubenswrapper[7457]: I0319 09:21:01.915325 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"]
Mar 19 09:21:01.929081 master-0 kubenswrapper[7457]: I0319 09:21:01.929025 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/506da1fd-f439-4b94-9940-3531ae009af0-kube-api-access\") pod \"installer-2-master-0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:01.956377 master-0 kubenswrapper[7457]: I0319 09:21:01.956297 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"]
Mar 19 09:21:01.957253 master-0 kubenswrapper[7457]: I0319 09:21:01.957232 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"]
Mar 19 09:21:01.968520 master-0 kubenswrapper[7457]: W0319 09:21:01.968471 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3374940a_612d_4335_8236_3ffe8d6e73a5.slice/crio-9c83b15dba69144ea146a0efbf84c34a34a7bbb646a98c775e5d8f6252c9784a WatchSource:0}: Error finding container 9c83b15dba69144ea146a0efbf84c34a34a7bbb646a98c775e5d8f6252c9784a: Status 404 returned error can't find the container with id 9c83b15dba69144ea146a0efbf84c34a34a7bbb646a98c775e5d8f6252c9784a
Mar 19 09:21:01.969229 master-0 kubenswrapper[7457]: W0319 09:21:01.969187 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf585ebb1_6210_463b_af85_fb29e1e7dfa5.slice/crio-f1b224e2c3a94acf439ad26c1933da88f3f4e0a2666ac3998bdb5a26f2159e15 WatchSource:0}: Error finding container f1b224e2c3a94acf439ad26c1933da88f3f4e0a2666ac3998bdb5a26f2159e15: Status 404 returned error can't find the container with id f1b224e2c3a94acf439ad26c1933da88f3f4e0a2666ac3998bdb5a26f2159e15
Mar 19 09:21:02.075883 master-0 kubenswrapper[7457]: I0319 09:21:02.075358 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86584b59c9-tws4x"]
Mar 19 09:21:02.086160 master-0
kubenswrapper[7457]: I0319 09:21:02.086119 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6sk\" (UniqueName: \"kubernetes.io/projected/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-kube-api-access-2f6sk\") pod \"controller-manager-6f89774b7d-nrm4r\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" Mar 19 09:21:02.190052 master-0 kubenswrapper[7457]: I0319 09:21:02.189999 7457 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a43e1754-66ff-49c4-8e64-65be7bae2819-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:02.190052 master-0 kubenswrapper[7457]: I0319 09:21:02.190040 7457 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a43e1754-66ff-49c4-8e64-65be7bae2819-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:02.201458 master-0 kubenswrapper[7457]: I0319 09:21:02.201408 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:02.309820 master-0 kubenswrapper[7457]: I0319 09:21:02.309273 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" Mar 19 09:21:02.341436 master-0 kubenswrapper[7457]: I0319 09:21:02.341394 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43e1754-66ff-49c4-8e64-65be7bae2819" path="/var/lib/kubelet/pods/a43e1754-66ff-49c4-8e64-65be7bae2819/volumes" Mar 19 09:21:02.341893 master-0 kubenswrapper[7457]: I0319 09:21:02.341866 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28" path="/var/lib/kubelet/pods/c55b5e8a-4dad-4ee8-9d45-2e75d2e9ae28/volumes" Mar 19 09:21:02.510417 master-0 kubenswrapper[7457]: I0319 09:21:02.510378 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 09:21:02.542874 master-0 kubenswrapper[7457]: I0319 09:21:02.542313 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"] Mar 19 09:21:02.551478 master-0 kubenswrapper[7457]: W0319 09:21:02.551438 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc76b3023_dcc2_4ea3_b590_bf7fd718fc3f.slice/crio-0c33816ff9a691d939d3249416d2119d08c88e42a7df1593bf891ba67f33f9b1 WatchSource:0}: Error finding container 0c33816ff9a691d939d3249416d2119d08c88e42a7df1593bf891ba67f33f9b1: Status 404 returned error can't find the container with id 0c33816ff9a691d939d3249416d2119d08c88e42a7df1593bf891ba67f33f9b1 Mar 19 09:21:02.680317 master-0 kubenswrapper[7457]: I0319 09:21:02.680252 7457 generic.go:334] "Generic (PLEG): container finished" podID="64f60856-22dd-4560-acff-c620e17844a1" containerID="a09f4dbb43b5d238fce47264297abcc4f7a5bcbcd572aa1022072a1fc9dfe9a1" exitCode=0 Mar 19 09:21:02.683444 master-0 kubenswrapper[7457]: I0319 09:21:02.680335 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" event={"ID":"64f60856-22dd-4560-acff-c620e17844a1","Type":"ContainerDied","Data":"a09f4dbb43b5d238fce47264297abcc4f7a5bcbcd572aa1022072a1fc9dfe9a1"} Mar 19 09:21:02.683444 master-0 kubenswrapper[7457]: I0319 09:21:02.682964 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"506da1fd-f439-4b94-9940-3531ae009af0","Type":"ContainerStarted","Data":"28eab80a8fe32d784a906dc9171fdb8777758efe35f66d6344a6634d3ff92ae7"} Mar 19 09:21:02.684464 master-0 kubenswrapper[7457]: I0319 09:21:02.684415 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" event={"ID":"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f","Type":"ContainerStarted","Data":"0c33816ff9a691d939d3249416d2119d08c88e42a7df1593bf891ba67f33f9b1"} Mar 19 09:21:02.692660 master-0 kubenswrapper[7457]: I0319 09:21:02.692587 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" event={"ID":"f585ebb1-6210-463b-af85-fb29e1e7dfa5","Type":"ContainerStarted","Data":"9c263da94ea924e6f0c119c4cce3035f09c0b5bab333af500085e8b99c1b49c0"} Mar 19 09:21:02.692764 master-0 kubenswrapper[7457]: I0319 09:21:02.692673 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" event={"ID":"f585ebb1-6210-463b-af85-fb29e1e7dfa5","Type":"ContainerStarted","Data":"76f3de4c762cae478a577d1d16dfb1ee4af5fa68c3f21bb2b3efce645591fdc4"} Mar 19 09:21:02.692764 master-0 kubenswrapper[7457]: I0319 09:21:02.692687 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" 
event={"ID":"f585ebb1-6210-463b-af85-fb29e1e7dfa5","Type":"ContainerStarted","Data":"f1b224e2c3a94acf439ad26c1933da88f3f4e0a2666ac3998bdb5a26f2159e15"} Mar 19 09:21:02.693703 master-0 kubenswrapper[7457]: I0319 09:21:02.693614 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:21:02.695220 master-0 kubenswrapper[7457]: I0319 09:21:02.695176 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" event={"ID":"3374940a-612d-4335-8236-3ffe8d6e73a5","Type":"ContainerStarted","Data":"b1150c5ddc8f3dad3084433acf3e72b1db9c58ad0b6290a41f9524f5639d8b9c"} Mar 19 09:21:02.695220 master-0 kubenswrapper[7457]: I0319 09:21:02.695214 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" event={"ID":"3374940a-612d-4335-8236-3ffe8d6e73a5","Type":"ContainerStarted","Data":"acdf7fffe158ff8ee0e9f20dc73b044b6ce4390206c892bc77e699955565d456"} Mar 19 09:21:02.695349 master-0 kubenswrapper[7457]: I0319 09:21:02.695229 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" event={"ID":"3374940a-612d-4335-8236-3ffe8d6e73a5","Type":"ContainerStarted","Data":"9c83b15dba69144ea146a0efbf84c34a34a7bbb646a98c775e5d8f6252c9784a"} Mar 19 09:21:02.695707 master-0 kubenswrapper[7457]: I0319 09:21:02.695673 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:21:03.016616 master-0 kubenswrapper[7457]: I0319 09:21:03.016506 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" podStartSLOduration=4.016490464 podStartE2EDuration="4.016490464s" podCreationTimestamp="2026-03-19 
09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:02.957774701 +0000 UTC m=+58.813114071" watchObservedRunningTime="2026-03-19 09:21:03.016490464 +0000 UTC m=+58.871829834" Mar 19 09:21:03.275560 master-0 kubenswrapper[7457]: I0319 09:21:03.274636 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:21:03.275560 master-0 kubenswrapper[7457]: I0319 09:21:03.275123 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:21:03.357487 master-0 kubenswrapper[7457]: I0319 09:21:03.357090 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:21:03.482618 master-0 kubenswrapper[7457]: I0319 09:21:03.476906 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" podStartSLOduration=4.476888606 podStartE2EDuration="4.476888606s" podCreationTimestamp="2026-03-19 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:03.015837888 +0000 UTC m=+58.871177278" watchObservedRunningTime="2026-03-19 09:21:03.476888606 +0000 UTC m=+59.332227976" Mar 19 09:21:03.713771 master-0 kubenswrapper[7457]: I0319 09:21:03.713717 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"506da1fd-f439-4b94-9940-3531ae009af0","Type":"ContainerStarted","Data":"40c2d0773871b0c8fab92e423550d9b8bdabe32991acae18be97a97bed1760e9"} Mar 19 09:21:03.716225 master-0 kubenswrapper[7457]: I0319 09:21:03.716199 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" event={"ID":"64f60856-22dd-4560-acff-c620e17844a1","Type":"ContainerStarted","Data":"130317d2b36f3a324391f5babb292e23818dadbc6dd459e4002d0bba9a53ca2d"} Mar 19 09:21:03.716298 master-0 kubenswrapper[7457]: I0319 09:21:03.716231 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" event={"ID":"64f60856-22dd-4560-acff-c620e17844a1","Type":"ContainerStarted","Data":"50b6589c30abb0e98915f4367bb65e4af9a76b84d574e33cab80c1e110d93fe1"} Mar 19 09:21:03.720419 master-0 kubenswrapper[7457]: I0319 09:21:03.720370 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:21:03.955331 master-0 kubenswrapper[7457]: I0319 09:21:03.955262 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=3.955242479 podStartE2EDuration="3.955242479s" podCreationTimestamp="2026-03-19 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:03.952916049 +0000 UTC m=+59.808255419" watchObservedRunningTime="2026-03-19 09:21:03.955242479 +0000 UTC m=+59.810581859" Mar 19 09:21:04.035853 master-0 kubenswrapper[7457]: I0319 09:21:04.034786 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" podStartSLOduration=8.973335302 podStartE2EDuration="14.034768587s" podCreationTimestamp="2026-03-19 09:20:50 +0000 UTC" firstStartedPulling="2026-03-19 09:20:56.847877625 +0000 UTC m=+52.703216995" lastFinishedPulling="2026-03-19 09:21:01.90931091 +0000 UTC m=+57.764650280" observedRunningTime="2026-03-19 09:21:04.027428529 +0000 UTC m=+59.882767909" watchObservedRunningTime="2026-03-19 09:21:04.034768587 +0000 UTC m=+59.890107957" Mar 19 09:21:04.621934 
master-0 kubenswrapper[7457]: I0319 09:21:04.621738 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn"] Mar 19 09:21:04.622454 master-0 kubenswrapper[7457]: I0319 09:21:04.622434 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.625570 master-0 kubenswrapper[7457]: I0319 09:21:04.625535 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:21:04.625805 master-0 kubenswrapper[7457]: I0319 09:21:04.625782 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:21:04.626359 master-0 kubenswrapper[7457]: I0319 09:21:04.626328 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:21:04.626757 master-0 kubenswrapper[7457]: I0319 09:21:04.626710 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:21:04.627037 master-0 kubenswrapper[7457]: I0319 09:21:04.627007 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:21:04.650161 master-0 kubenswrapper[7457]: I0319 09:21:04.650103 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn"] Mar 19 09:21:04.736129 master-0 kubenswrapper[7457]: I0319 09:21:04.736040 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-config\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " 
pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.736934 master-0 kubenswrapper[7457]: I0319 09:21:04.736187 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-client-ca\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.736934 master-0 kubenswrapper[7457]: I0319 09:21:04.736265 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526fc7-b3af-4146-a791-fad627e8c9fa-serving-cert\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.736934 master-0 kubenswrapper[7457]: I0319 09:21:04.736310 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndzd\" (UniqueName: \"kubernetes.io/projected/c3526fc7-b3af-4146-a791-fad627e8c9fa-kube-api-access-6ndzd\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.837986 master-0 kubenswrapper[7457]: I0319 09:21:04.837912 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-config\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.838215 master-0 kubenswrapper[7457]: I0319 
09:21:04.838018 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-client-ca\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.838215 master-0 kubenswrapper[7457]: I0319 09:21:04.838039 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526fc7-b3af-4146-a791-fad627e8c9fa-serving-cert\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.838215 master-0 kubenswrapper[7457]: I0319 09:21:04.838156 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndzd\" (UniqueName: \"kubernetes.io/projected/c3526fc7-b3af-4146-a791-fad627e8c9fa-kube-api-access-6ndzd\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.840781 master-0 kubenswrapper[7457]: I0319 09:21:04.840752 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-client-ca\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.841035 master-0 kubenswrapper[7457]: I0319 09:21:04.841014 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-config\") pod 
\"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.846057 master-0 kubenswrapper[7457]: I0319 09:21:04.845675 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526fc7-b3af-4146-a791-fad627e8c9fa-serving-cert\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.858692 master-0 kubenswrapper[7457]: I0319 09:21:04.858653 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndzd\" (UniqueName: \"kubernetes.io/projected/c3526fc7-b3af-4146-a791-fad627e8c9fa-kube-api-access-6ndzd\") pod \"route-controller-manager-597786f6d8-qsfjn\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:04.951357 master-0 kubenswrapper[7457]: I0319 09:21:04.951233 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:05.199827 master-0 kubenswrapper[7457]: I0319 09:21:05.198648 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn"] Mar 19 09:21:05.729427 master-0 kubenswrapper[7457]: I0319 09:21:05.729139 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" event={"ID":"c3526fc7-b3af-4146-a791-fad627e8c9fa","Type":"ContainerStarted","Data":"e391dcffd584ac2ff23381e87dc61df1ca458dbf6b1c4be91fe138eb91648bf7"} Mar 19 09:21:06.593373 master-0 kubenswrapper[7457]: I0319 09:21:06.593297 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:21:06.594197 master-0 kubenswrapper[7457]: I0319 09:21:06.594148 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:21:06.601263 master-0 kubenswrapper[7457]: I0319 09:21:06.601204 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:21:06.736993 master-0 kubenswrapper[7457]: I0319 09:21:06.735740 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" event={"ID":"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f","Type":"ContainerStarted","Data":"e78d3dce63ea732d696eb7bff751b92b4362afe268916ddac8100e99641f0a5b"} Mar 19 09:21:06.736993 master-0 kubenswrapper[7457]: I0319 09:21:06.736788 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" Mar 19 09:21:06.740199 master-0 kubenswrapper[7457]: I0319 09:21:06.740166 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:21:06.740688 master-0 kubenswrapper[7457]: I0319 09:21:06.740647 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" Mar 19 09:21:06.788783 master-0 kubenswrapper[7457]: I0319 09:21:06.788722 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" podStartSLOduration=3.87228811 podStartE2EDuration="7.788704346s" podCreationTimestamp="2026-03-19 09:20:59 +0000 UTC" firstStartedPulling="2026-03-19 09:21:02.555348705 +0000 UTC m=+58.410688075" lastFinishedPulling="2026-03-19 09:21:06.471764941 +0000 UTC m=+62.327104311" observedRunningTime="2026-03-19 09:21:06.788052649 +0000 UTC m=+62.643392029" watchObservedRunningTime="2026-03-19 09:21:06.788704346 +0000 UTC m=+62.644043716" Mar 19 09:21:07.193655 master-0 kubenswrapper[7457]: I0319 09:21:07.193587 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 09:21:07.194571 master-0 kubenswrapper[7457]: I0319 09:21:07.194270 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.197152 master-0 kubenswrapper[7457]: I0319 09:21:07.196972 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 09:21:07.208103 master-0 kubenswrapper[7457]: I0319 09:21:07.207476 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 09:21:07.299576 master-0 kubenswrapper[7457]: I0319 09:21:07.299500 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/434aabfa-50db-407e-92d3-a034696613e3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.299789 master-0 kubenswrapper[7457]: I0319 09:21:07.299611 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.299789 master-0 kubenswrapper[7457]: I0319 09:21:07.299642 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-var-lock\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.400698 master-0 kubenswrapper[7457]: I0319 09:21:07.400632 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/434aabfa-50db-407e-92d3-a034696613e3-kube-api-access\") pod \"installer-1-master-0\" (UID: 
\"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.400698 master-0 kubenswrapper[7457]: I0319 09:21:07.400708 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.400924 master-0 kubenswrapper[7457]: I0319 09:21:07.400733 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-var-lock\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.400924 master-0 kubenswrapper[7457]: I0319 09:21:07.400877 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.400987 master-0 kubenswrapper[7457]: I0319 09:21:07.400945 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-var-lock\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.442434 master-0 kubenswrapper[7457]: I0319 09:21:07.440963 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/434aabfa-50db-407e-92d3-a034696613e3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " 
pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:07.563026 master-0 kubenswrapper[7457]: I0319 09:21:07.562934 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:08.225984 master-0 kubenswrapper[7457]: I0319 09:21:08.225616 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 09:21:08.745652 master-0 kubenswrapper[7457]: I0319 09:21:08.745583 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"434aabfa-50db-407e-92d3-a034696613e3","Type":"ContainerStarted","Data":"9f695fbbb2ac33712845536e84cccd0ed476913549534361c504ad37ba881e39"} Mar 19 09:21:08.972329 master-0 kubenswrapper[7457]: I0319 09:21:08.971742 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 09:21:08.972930 master-0 kubenswrapper[7457]: I0319 09:21:08.972568 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="506da1fd-f439-4b94-9940-3531ae009af0" containerName="installer" containerID="cri-o://40c2d0773871b0c8fab92e423550d9b8bdabe32991acae18be97a97bed1760e9" gracePeriod=30 Mar 19 09:21:09.235636 master-0 kubenswrapper[7457]: I0319 09:21:09.235504 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:21:09.235636 master-0 kubenswrapper[7457]: I0319 09:21:09.235566 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:21:09.236444 master-0 kubenswrapper[7457]: I0319 09:21:09.235733 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:21:09.236444 master-0 kubenswrapper[7457]: I0319 09:21:09.236075 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:21:09.236444 master-0 kubenswrapper[7457]: I0319 09:21:09.236129 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:21:09.236444 master-0 kubenswrapper[7457]: I0319 09:21:09.236165 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" 
Mar 19 09:21:09.236745 master-0 kubenswrapper[7457]: I0319 09:21:09.236684 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:21:09.236809 master-0 kubenswrapper[7457]: I0319 09:21:09.236756 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:21:09.237059 master-0 kubenswrapper[7457]: I0319 09:21:09.237021 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:21:09.237487 master-0 kubenswrapper[7457]: I0319 09:21:09.237270 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:21:09.237487 master-0 kubenswrapper[7457]: I0319 09:21:09.237353 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:21:09.237487 master-0 kubenswrapper[7457]: I0319 09:21:09.237386 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:21:09.237487 master-0 kubenswrapper[7457]: I0319 09:21:09.237430 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:21:09.241268 master-0 kubenswrapper[7457]: I0319 09:21:09.241209 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:21:09.241730 master-0 kubenswrapper[7457]: I0319 09:21:09.241688 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:21:09.241823 master-0 kubenswrapper[7457]: I0319 09:21:09.241797 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:21:09.242803 master-0 kubenswrapper[7457]: I0319 09:21:09.242744 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:21:09.243934 master-0 kubenswrapper[7457]: I0319 09:21:09.243871 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:21:09.245374 master-0 kubenswrapper[7457]: I0319 09:21:09.245330 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:21:09.245374 master-0 kubenswrapper[7457]: I0319 09:21:09.245353 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:21:09.246079 master-0 kubenswrapper[7457]: I0319 09:21:09.246042 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:21:09.246592 master-0 kubenswrapper[7457]: I0319 09:21:09.246543 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:21:09.247667 master-0 kubenswrapper[7457]: I0319 09:21:09.247625 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:21:09.247825 master-0 kubenswrapper[7457]: I0319 09:21:09.247789 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:21:09.250173 master-0 kubenswrapper[7457]: I0319 09:21:09.250113 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:21:09.250173 master-0 kubenswrapper[7457]: I0319 09:21:09.250123 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:21:09.482954 master-0 kubenswrapper[7457]: I0319 09:21:09.482835 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:21:09.482954 master-0 kubenswrapper[7457]: I0319 09:21:09.482870 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:21:09.483195 master-0 kubenswrapper[7457]: I0319 09:21:09.483041 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:21:09.483195 master-0 kubenswrapper[7457]: I0319 09:21:09.483121 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:21:09.483287 master-0 kubenswrapper[7457]: I0319 09:21:09.483253 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:21:09.483325 master-0 kubenswrapper[7457]: I0319 09:21:09.483289 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:21:09.485332 master-0 kubenswrapper[7457]: I0319 09:21:09.485297 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:21:09.489876 master-0 kubenswrapper[7457]: I0319 09:21:09.489835 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:21:09.489953 master-0 kubenswrapper[7457]: I0319 09:21:09.489923 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:21:09.489995 master-0 kubenswrapper[7457]: I0319 09:21:09.489948 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:21:09.490174 master-0 kubenswrapper[7457]: I0319 09:21:09.490117 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:21:09.490270 master-0 kubenswrapper[7457]: I0319 09:21:09.490214 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:21:09.751771 master-0 kubenswrapper[7457]: I0319 09:21:09.751689 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_506da1fd-f439-4b94-9940-3531ae009af0/installer/0.log" Mar 19 09:21:09.751771 master-0 kubenswrapper[7457]: I0319 09:21:09.751734 7457 generic.go:334] "Generic (PLEG): container finished" podID="506da1fd-f439-4b94-9940-3531ae009af0" containerID="40c2d0773871b0c8fab92e423550d9b8bdabe32991acae18be97a97bed1760e9" exitCode=1 Mar 19 09:21:09.751974 master-0 kubenswrapper[7457]: I0319 09:21:09.751783 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"506da1fd-f439-4b94-9940-3531ae009af0","Type":"ContainerDied","Data":"40c2d0773871b0c8fab92e423550d9b8bdabe32991acae18be97a97bed1760e9"} Mar 19 09:21:09.753400 master-0 kubenswrapper[7457]: I0319 09:21:09.753380 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"434aabfa-50db-407e-92d3-a034696613e3","Type":"ContainerStarted","Data":"5a3b40e5aadf949e686ac4f447f2417ca9edf3ac74f9cc8e180b0ad3fbdc1cbc"} Mar 19 09:21:09.956988 master-0 kubenswrapper[7457]: I0319 09:21:09.956665 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_506da1fd-f439-4b94-9940-3531ae009af0/installer/0.log" Mar 19 09:21:09.957168 master-0 kubenswrapper[7457]: I0319 09:21:09.957033 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:10.017931 master-0 kubenswrapper[7457]: I0319 09:21:10.016610 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"] Mar 19 09:21:10.017931 master-0 kubenswrapper[7457]: I0319 09:21:10.016625 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=3.016609735 podStartE2EDuration="3.016609735s" podCreationTimestamp="2026-03-19 09:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:10.015437355 +0000 UTC m=+65.870776755" watchObservedRunningTime="2026-03-19 09:21:10.016609735 +0000 UTC m=+65.871949105" Mar 19 09:21:10.053781 master-0 kubenswrapper[7457]: I0319 09:21:10.053029 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-kubelet-dir\") pod \"506da1fd-f439-4b94-9940-3531ae009af0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " Mar 19 09:21:10.053781 master-0 kubenswrapper[7457]: I0319 09:21:10.053089 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-var-lock\") pod \"506da1fd-f439-4b94-9940-3531ae009af0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " Mar 19 09:21:10.053781 master-0 kubenswrapper[7457]: I0319 09:21:10.053163 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/506da1fd-f439-4b94-9940-3531ae009af0-kube-api-access\") pod \"506da1fd-f439-4b94-9940-3531ae009af0\" (UID: \"506da1fd-f439-4b94-9940-3531ae009af0\") " Mar 19 09:21:10.054197 master-0 
kubenswrapper[7457]: I0319 09:21:10.054145 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "506da1fd-f439-4b94-9940-3531ae009af0" (UID: "506da1fd-f439-4b94-9940-3531ae009af0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:10.054249 master-0 kubenswrapper[7457]: I0319 09:21:10.054199 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-var-lock" (OuterVolumeSpecName: "var-lock") pod "506da1fd-f439-4b94-9940-3531ae009af0" (UID: "506da1fd-f439-4b94-9940-3531ae009af0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:10.063577 master-0 kubenswrapper[7457]: I0319 09:21:10.060415 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506da1fd-f439-4b94-9940-3531ae009af0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "506da1fd-f439-4b94-9940-3531ae009af0" (UID: "506da1fd-f439-4b94-9940-3531ae009af0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:10.155165 master-0 kubenswrapper[7457]: I0319 09:21:10.154272 7457 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:10.155165 master-0 kubenswrapper[7457]: I0319 09:21:10.154311 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/506da1fd-f439-4b94-9940-3531ae009af0-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:10.155165 master-0 kubenswrapper[7457]: I0319 09:21:10.154322 7457 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/506da1fd-f439-4b94-9940-3531ae009af0-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:10.162091 master-0 kubenswrapper[7457]: I0319 09:21:10.158285 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:21:10.225433 master-0 kubenswrapper[7457]: I0319 09:21:10.221305 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-gxznr"] Mar 19 09:21:10.290177 master-0 kubenswrapper[7457]: I0319 09:21:10.289090 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:21:10.478698 master-0 kubenswrapper[7457]: I0319 09:21:10.477518 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"] Mar 19 09:21:10.478698 master-0 kubenswrapper[7457]: I0319 09:21:10.477607 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"] Mar 19 09:21:10.478698 
master-0 kubenswrapper[7457]: I0319 09:21:10.477618 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"] Mar 19 09:21:10.759015 master-0 kubenswrapper[7457]: I0319 09:21:10.758968 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" event={"ID":"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e","Type":"ContainerStarted","Data":"202f921c55b75970ad21b44a2165cf9cf2366346959189388624b5cff168cafb"} Mar 19 09:21:10.760176 master-0 kubenswrapper[7457]: I0319 09:21:10.760143 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" event={"ID":"c247d991-809e-46b6-9617-9b05007b7560","Type":"ContainerStarted","Data":"68534953ce4f9d64b4ca25577e4617ff34537dd8175ec1c79125e169063bd6f3"} Mar 19 09:21:10.762603 master-0 kubenswrapper[7457]: I0319 09:21:10.762556 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_506da1fd-f439-4b94-9940-3531ae009af0/installer/0.log" Mar 19 09:21:10.762661 master-0 kubenswrapper[7457]: I0319 09:21:10.762646 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"506da1fd-f439-4b94-9940-3531ae009af0","Type":"ContainerDied","Data":"28eab80a8fe32d784a906dc9171fdb8777758efe35f66d6344a6634d3ff92ae7"} Mar 19 09:21:10.762726 master-0 kubenswrapper[7457]: I0319 09:21:10.762679 7457 scope.go:117] "RemoveContainer" containerID="40c2d0773871b0c8fab92e423550d9b8bdabe32991acae18be97a97bed1760e9" Mar 19 09:21:10.762818 master-0 kubenswrapper[7457]: I0319 09:21:10.762749 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:10.763995 master-0 kubenswrapper[7457]: I0319 09:21:10.763963 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" event={"ID":"8beda3a0-a653-4810-b3f2-d25badb21ab1","Type":"ContainerStarted","Data":"3d5b3f08e9980af7a4eb46a62a5af4211db365f64342fe1705f26fa41b7b1331"} Mar 19 09:21:10.765937 master-0 kubenswrapper[7457]: I0319 09:21:10.765875 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" event={"ID":"51b88818-5108-40db-90c8-4f2e7198959e","Type":"ContainerStarted","Data":"2451d7e3dd79303504d5964f5bc9fe498e3fce32e9bf236a0e1ab73d89c4fa39"} Mar 19 09:21:10.769066 master-0 kubenswrapper[7457]: I0319 09:21:10.769036 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" event={"ID":"c3526fc7-b3af-4146-a791-fad627e8c9fa","Type":"ContainerStarted","Data":"66bf409d4e408d987de5c230847e4aa6700de4c6d1fcbc684926aa5267864de3"} Mar 19 09:21:10.769362 master-0 kubenswrapper[7457]: I0319 09:21:10.769321 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:10.770966 master-0 kubenswrapper[7457]: I0319 09:21:10.770905 7457 generic.go:334] "Generic (PLEG): container finished" podID="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" containerID="2c0b681fce22722dc6bda98cf745e2b79d2558bec9534ca23b3f5d2d7fcdef7a" exitCode=0 Mar 19 09:21:10.771065 master-0 kubenswrapper[7457]: I0319 09:21:10.771022 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" 
event={"ID":"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb","Type":"ContainerDied","Data":"2c0b681fce22722dc6bda98cf745e2b79d2558bec9534ca23b3f5d2d7fcdef7a"} Mar 19 09:21:10.771625 master-0 kubenswrapper[7457]: I0319 09:21:10.771578 7457 scope.go:117] "RemoveContainer" containerID="2c0b681fce22722dc6bda98cf745e2b79d2558bec9534ca23b3f5d2d7fcdef7a" Mar 19 09:21:10.772379 master-0 kubenswrapper[7457]: I0319 09:21:10.772220 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" event={"ID":"9076d131-644a-4332-8a70-34f6b0f71575","Type":"ContainerStarted","Data":"3e14d5393be022eff24be7e8d5e671dc610671f728796a2cb5a2309e1895b5f0"} Mar 19 09:21:10.783620 master-0 kubenswrapper[7457]: I0319 09:21:10.783492 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" event={"ID":"259794ab-d027-497a-b08e-5a6d79057668","Type":"ContainerStarted","Data":"d9034c9252b2dbe49fa20bf241af605c2b9efd4ec2d903f7338b331b9a335a60"} Mar 19 09:21:11.770451 master-0 kubenswrapper[7457]: I0319 09:21:11.770364 7457 patch_prober.go:28] interesting pod/route-controller-manager-597786f6d8-qsfjn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:21:11.771244 master-0 kubenswrapper[7457]: I0319 09:21:11.770445 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 
09:21:12.580403 master-0 kubenswrapper[7457]: I0319 09:21:12.579941 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nq9vs"] Mar 19 09:21:12.789229 master-0 kubenswrapper[7457]: I0319 09:21:12.789150 7457 patch_prober.go:28] interesting pod/route-controller-manager-597786f6d8-qsfjn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:21:12.789229 master-0 kubenswrapper[7457]: I0319 09:21:12.789223 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:12.794934 master-0 kubenswrapper[7457]: I0319 09:21:12.794888 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" event={"ID":"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb","Type":"ContainerStarted","Data":"502fb65a66bd678cd094ffbc2c7b5d1efb90edcd0d5865eef580409121caf731"} Mar 19 09:21:12.796116 master-0 kubenswrapper[7457]: I0319 09:21:12.796076 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nq9vs" event={"ID":"13072c08-c77c-4170-9ebe-98d63968747b","Type":"ContainerStarted","Data":"9e6502d00c4d560279a6b84e0eac2864639061d852a900f00e6d52ff81453134"} Mar 19 09:21:14.094637 master-0 kubenswrapper[7457]: I0319 09:21:14.094577 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:21:14.095333 
master-0 kubenswrapper[7457]: E0319 09:21:14.094787 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506da1fd-f439-4b94-9940-3531ae009af0" containerName="installer" Mar 19 09:21:14.095333 master-0 kubenswrapper[7457]: I0319 09:21:14.094802 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="506da1fd-f439-4b94-9940-3531ae009af0" containerName="installer" Mar 19 09:21:14.095333 master-0 kubenswrapper[7457]: I0319 09:21:14.094892 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="506da1fd-f439-4b94-9940-3531ae009af0" containerName="installer" Mar 19 09:21:14.095333 master-0 kubenswrapper[7457]: I0319 09:21:14.095250 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.115697 master-0 kubenswrapper[7457]: I0319 09:21:14.112113 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"] Mar 19 09:21:14.135926 master-0 kubenswrapper[7457]: I0319 09:21:14.132621 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"] Mar 19 09:21:14.135926 master-0 kubenswrapper[7457]: I0319 09:21:14.135463 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-cbw4r"] Mar 19 09:21:14.145211 master-0 kubenswrapper[7457]: I0319 09:21:14.144996 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"] Mar 19 09:21:14.161677 master-0 kubenswrapper[7457]: I0319 09:21:14.159080 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"] Mar 19 09:21:14.185457 master-0 kubenswrapper[7457]: I0319 09:21:14.180389 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 
09:21:14.201639 master-0 kubenswrapper[7457]: I0319 09:21:14.190644 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 09:21:14.220729 master-0 kubenswrapper[7457]: I0319 09:21:14.207817 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-var-lock\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.220729 master-0 kubenswrapper[7457]: I0319 09:21:14.207949 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.220729 master-0 kubenswrapper[7457]: I0319 09:21:14.207981 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.220729 master-0 kubenswrapper[7457]: I0319 09:21:14.210218 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 09:21:14.289554 master-0 kubenswrapper[7457]: I0319 09:21:14.285661 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" podStartSLOduration=10.467882429 podStartE2EDuration="15.284493317s" podCreationTimestamp="2026-03-19 09:20:59 +0000 UTC" firstStartedPulling="2026-03-19 
09:21:05.214206833 +0000 UTC m=+61.069546203" lastFinishedPulling="2026-03-19 09:21:10.030817721 +0000 UTC m=+65.886157091" observedRunningTime="2026-03-19 09:21:14.274581051 +0000 UTC m=+70.129920431" watchObservedRunningTime="2026-03-19 09:21:14.284493317 +0000 UTC m=+70.139832707" Mar 19 09:21:14.319653 master-0 kubenswrapper[7457]: I0319 09:21:14.316000 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-var-lock\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.319653 master-0 kubenswrapper[7457]: I0319 09:21:14.316154 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-var-lock\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.319653 master-0 kubenswrapper[7457]: I0319 09:21:14.316408 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.319653 master-0 kubenswrapper[7457]: I0319 09:21:14.316510 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.319653 master-0 kubenswrapper[7457]: I0319 09:21:14.316601 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.384573 master-0 kubenswrapper[7457]: I0319 09:21:14.384477 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.388572 master-0 kubenswrapper[7457]: I0319 09:21:14.388421 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506da1fd-f439-4b94-9940-3531ae009af0" path="/var/lib/kubelet/pods/506da1fd-f439-4b94-9940-3531ae009af0/volumes" Mar 19 09:21:14.547899 master-0 kubenswrapper[7457]: I0319 09:21:14.547857 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:14.813215 master-0 kubenswrapper[7457]: I0319 09:21:14.810341 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" event={"ID":"9a6c1523-e77c-4aac-814c-05d41215c42f","Type":"ContainerStarted","Data":"32b3d465eca181837f574822533acf4722de4ec91ef144516553a5bbdf0e91dc"} Mar 19 09:21:14.813215 master-0 kubenswrapper[7457]: I0319 09:21:14.810688 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" event={"ID":"9a6c1523-e77c-4aac-814c-05d41215c42f","Type":"ContainerStarted","Data":"5001c0304645acfa799077998786ddfe7d90e702ba8e83ddc5ed0850af9bd30d"} Mar 19 09:21:14.813215 master-0 kubenswrapper[7457]: I0319 09:21:14.813055 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" event={"ID":"03d12dab-1215-4c1f-a9f5-27ea7174d308","Type":"ContainerStarted","Data":"c604e07b23c824fe44edd155fd3bcc4d87de07b9af516a6fc04d64e9a7ef4a11"} Mar 19 09:21:14.817598 master-0 kubenswrapper[7457]: I0319 09:21:14.814416 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" event={"ID":"16d2930b-486b-492d-983e-c6702d8f53a7","Type":"ContainerStarted","Data":"4e6b0fbcf10efb9d378e7013c9ee95c7eea5f13187283f4e3dcc1192d68f1166"} Mar 19 09:21:14.817598 master-0 kubenswrapper[7457]: I0319 09:21:14.815736 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" event={"ID":"7b29cb7b-26d2-4fab-9e03-2d7fdf937592","Type":"ContainerStarted","Data":"2c994266a2d02f0511c56e203e02c66ec993c8a4956cebe37152ed3179a4c4ff"} Mar 19 09:21:14.817817 master-0 kubenswrapper[7457]: I0319 09:21:14.817786 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" event={"ID":"8c8ee765-76b8-4cde-8acb-6e5edd1b8149","Type":"ContainerStarted","Data":"82537ef54e6382f32d5fe80aff3875a880fc715c44848fde8e9d22a20125f223"} Mar 19 09:21:14.863907 master-0 kubenswrapper[7457]: I0319 09:21:14.863728 7457 generic.go:334] "Generic (PLEG): container finished" podID="4abcf2ea-50f5-4d62-8a23-583438e5b451" containerID="0ca36c4228886afdfb6b80a61e2423dd76188b00985839f0f0ad53c1f5d31db7" exitCode=0 Mar 19 09:21:14.863907 master-0 kubenswrapper[7457]: I0319 09:21:14.863784 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" event={"ID":"4abcf2ea-50f5-4d62-8a23-583438e5b451","Type":"ContainerDied","Data":"0ca36c4228886afdfb6b80a61e2423dd76188b00985839f0f0ad53c1f5d31db7"} Mar 19 09:21:14.864125 master-0 kubenswrapper[7457]: I0319 09:21:14.864110 7457 scope.go:117] "RemoveContainer" containerID="0ca36c4228886afdfb6b80a61e2423dd76188b00985839f0f0ad53c1f5d31db7" Mar 19 09:21:14.955997 master-0 kubenswrapper[7457]: I0319 09:21:14.955949 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:15.304768 master-0 kubenswrapper[7457]: I0319 09:21:15.304693 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:21:15.872325 master-0 kubenswrapper[7457]: I0319 09:21:15.872209 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" event={"ID":"4abcf2ea-50f5-4d62-8a23-583438e5b451","Type":"ContainerStarted","Data":"84cb64da9c8a09f0b6ce4a55bb02c594389448a951da804bd9d7fc776340256c"} Mar 19 09:21:15.880752 master-0 kubenswrapper[7457]: I0319 09:21:15.880694 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" 
event={"ID":"7fcfef1e-0652-4c52-a0a8-dfbf15408d03","Type":"ContainerStarted","Data":"e6e89ca8e17f62a01908661aaa2823a47bd6605150b0289a08fa08b57ffe48cb"} Mar 19 09:21:15.880752 master-0 kubenswrapper[7457]: I0319 09:21:15.880724 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7fcfef1e-0652-4c52-a0a8-dfbf15408d03","Type":"ContainerStarted","Data":"a7b68d32c6a1be3d1b63c4c0cc7b8825417f61ee0fb6f0e7084f5845530ba18c"} Mar 19 09:21:16.305022 master-0 kubenswrapper[7457]: I0319 09:21:16.304028 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 09:21:16.305022 master-0 kubenswrapper[7457]: I0319 09:21:16.304709 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.306928 master-0 kubenswrapper[7457]: I0319 09:21:16.306890 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:21:16.331575 master-0 kubenswrapper[7457]: I0319 09:21:16.327788 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 09:21:16.392618 master-0 kubenswrapper[7457]: I0319 09:21:16.392563 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kube-api-access\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.392838 master-0 kubenswrapper[7457]: I0319 09:21:16.392799 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kubelet-dir\") pod 
\"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.393150 master-0 kubenswrapper[7457]: I0319 09:21:16.393112 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-var-lock\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.495752 master-0 kubenswrapper[7457]: I0319 09:21:16.493982 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-var-lock\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.495752 master-0 kubenswrapper[7457]: I0319 09:21:16.494033 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kube-api-access\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.495752 master-0 kubenswrapper[7457]: I0319 09:21:16.494058 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.495752 master-0 kubenswrapper[7457]: I0319 09:21:16.494127 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.495752 master-0 kubenswrapper[7457]: I0319 09:21:16.494164 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-var-lock\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.531972 master-0 kubenswrapper[7457]: I0319 09:21:16.531901 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kube-api-access\") pod \"installer-1-master-0\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.637053 master-0 kubenswrapper[7457]: I0319 09:21:16.636917 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.924868 master-0 kubenswrapper[7457]: I0319 09:21:16.924042 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=2.924026458 podStartE2EDuration="2.924026458s" podCreationTimestamp="2026-03-19 09:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:16.921890733 +0000 UTC m=+72.777230103" watchObservedRunningTime="2026-03-19 09:21:16.924026458 +0000 UTC m=+72.779365828" Mar 19 09:21:18.481353 master-0 kubenswrapper[7457]: I0319 09:21:18.481287 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:21:18.482344 master-0 kubenswrapper[7457]: I0319 09:21:18.481515 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="7fcfef1e-0652-4c52-a0a8-dfbf15408d03" containerName="installer" containerID="cri-o://e6e89ca8e17f62a01908661aaa2823a47bd6605150b0289a08fa08b57ffe48cb" gracePeriod=30 Mar 19 09:21:18.671085 master-0 kubenswrapper[7457]: I0319 09:21:18.670794 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"] Mar 19 09:21:18.671085 master-0 kubenswrapper[7457]: I0319 09:21:18.671072 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" podUID="c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" containerName="controller-manager" containerID="cri-o://e78d3dce63ea732d696eb7bff751b92b4362afe268916ddac8100e99641f0a5b" gracePeriod=30 Mar 19 09:21:18.685383 master-0 kubenswrapper[7457]: I0319 09:21:18.685324 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn"] Mar 19 09:21:18.685756 master-0 kubenswrapper[7457]: I0319 09:21:18.685705 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerName="route-controller-manager" containerID="cri-o://66bf409d4e408d987de5c230847e4aa6700de4c6d1fcbc684926aa5267864de3" gracePeriod=30 Mar 19 09:21:18.875575 master-0 kubenswrapper[7457]: E0319 09:21:18.873054 7457 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc76b3023_dcc2_4ea3_b590_bf7fd718fc3f.slice/crio-conmon-e78d3dce63ea732d696eb7bff751b92b4362afe268916ddac8100e99641f0a5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod506da1fd_f439_4b94_9940_3531ae009af0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03f97d1_b6fe_4fc9_8cb5_c97af7a651bb.slice/crio-2c0b681fce22722dc6bda98cf745e2b79d2558bec9534ca23b3f5d2d7fcdef7a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03f97d1_b6fe_4fc9_8cb5_c97af7a651bb.slice/crio-conmon-2c0b681fce22722dc6bda98cf745e2b79d2558bec9534ca23b3f5d2d7fcdef7a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod506da1fd_f439_4b94_9940_3531ae009af0.slice/crio-28eab80a8fe32d784a906dc9171fdb8777758efe35f66d6344a6634d3ff92ae7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod506da1fd_f439_4b94_9940_3531ae009af0.slice/crio-conmon-40c2d0773871b0c8fab92e423550d9b8bdabe32991acae18be97a97bed1760e9.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abcf2ea_50f5_4d62_8a23_583438e5b451.slice/crio-conmon-0ca36c4228886afdfb6b80a61e2423dd76188b00985839f0f0ad53c1f5d31db7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod506da1fd_f439_4b94_9940_3531ae009af0.slice/crio-40c2d0773871b0c8fab92e423550d9b8bdabe32991acae18be97a97bed1760e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abcf2ea_50f5_4d62_8a23_583438e5b451.slice/crio-0ca36c4228886afdfb6b80a61e2423dd76188b00985839f0f0ad53c1f5d31db7.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:21:20.899793 master-0 kubenswrapper[7457]: I0319 09:21:20.899678 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 09:21:20.900766 master-0 kubenswrapper[7457]: I0319 09:21:20.900420 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:20.903047 master-0 kubenswrapper[7457]: I0319 09:21:20.903013 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-xzz4b" Mar 19 09:21:20.909749 master-0 kubenswrapper[7457]: I0319 09:21:20.909560 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 09:21:20.961537 master-0 kubenswrapper[7457]: I0319 09:21:20.961452 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/014ef8bd-b940-41e2-9239-c238afe6ebae-kube-api-access\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:20.961716 master-0 kubenswrapper[7457]: I0319 09:21:20.961547 7457 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-var-lock\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:20.961716 master-0 kubenswrapper[7457]: I0319 09:21:20.961571 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.062245 master-0 kubenswrapper[7457]: I0319 09:21:21.062179 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/014ef8bd-b940-41e2-9239-c238afe6ebae-kube-api-access\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.062245 master-0 kubenswrapper[7457]: I0319 09:21:21.062228 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-var-lock\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.062245 master-0 kubenswrapper[7457]: I0319 09:21:21.062245 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.062643 master-0 kubenswrapper[7457]: I0319 09:21:21.062306 7457 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.062698 master-0 kubenswrapper[7457]: I0319 09:21:21.062650 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-var-lock\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.082998 master-0 kubenswrapper[7457]: I0319 09:21:21.082955 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/014ef8bd-b940-41e2-9239-c238afe6ebae-kube-api-access\") pod \"installer-4-master-0\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.230431 master-0 kubenswrapper[7457]: I0319 09:21:21.230323 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:21.921250 master-0 kubenswrapper[7457]: I0319 09:21:21.921195 7457 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 09:21:21.921742 master-0 kubenswrapper[7457]: I0319 09:21:21.921435 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" containerID="cri-o://ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98" gracePeriod=30 Mar 19 09:21:21.921742 master-0 kubenswrapper[7457]: I0319 09:21:21.921598 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" containerID="cri-o://5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538" gracePeriod=30 Mar 19 09:21:21.924197 master-0 kubenswrapper[7457]: I0319 09:21:21.924159 7457 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:21:21.924429 master-0 kubenswrapper[7457]: E0319 09:21:21.924402 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 09:21:21.924429 master-0 kubenswrapper[7457]: I0319 09:21:21.924423 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 09:21:21.924491 master-0 kubenswrapper[7457]: E0319 09:21:21.924436 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 09:21:21.924491 master-0 kubenswrapper[7457]: I0319 09:21:21.924447 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 09:21:21.924671 master-0 kubenswrapper[7457]: I0319 
09:21:21.924640 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 09:21:21.924705 master-0 kubenswrapper[7457]: I0319 09:21:21.924675 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 09:21:21.943071 master-0 kubenswrapper[7457]: I0319 09:21:21.943017 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:21:21.972450 master-0 kubenswrapper[7457]: I0319 09:21:21.972281 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:21.972450 master-0 kubenswrapper[7457]: I0319 09:21:21.972330 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:21.972450 master-0 kubenswrapper[7457]: I0319 09:21:21.972355 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:21.972450 master-0 kubenswrapper[7457]: I0319 09:21:21.972371 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: 
\"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:21.972450 master-0 kubenswrapper[7457]: I0319 09:21:21.972386 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:21.972450 master-0 kubenswrapper[7457]: I0319 09:21:21.972400 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073699 master-0 kubenswrapper[7457]: I0319 09:21:22.073640 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073699 master-0 kubenswrapper[7457]: I0319 09:21:22.073707 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073735 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073767 
7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073808 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073835 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073866 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073888 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073909 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: 
\"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073939 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.073960 master-0 kubenswrapper[7457]: I0319 09:21:22.073909 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.074267 master-0 kubenswrapper[7457]: I0319 09:21:22.073945 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:22.310560 master-0 kubenswrapper[7457]: I0319 09:21:22.310473 7457 patch_prober.go:28] interesting pod/controller-manager-6f89774b7d-nrm4r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.40:8443/healthz\": dial tcp 10.128.0.40:8443: connect: connection refused" start-of-body= Mar 19 09:21:22.310978 master-0 kubenswrapper[7457]: I0319 09:21:22.310926 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" podUID="c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.40:8443/healthz\": dial tcp 10.128.0.40:8443: connect: connection refused" Mar 19 09:21:23.931821 master-0 kubenswrapper[7457]: I0319 09:21:23.931513 7457 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_06b63f3f-ca62-4195-80e9-7ff427e1c58b/installer/0.log" Mar 19 09:21:23.932437 master-0 kubenswrapper[7457]: I0319 09:21:23.931826 7457 generic.go:334] "Generic (PLEG): container finished" podID="06b63f3f-ca62-4195-80e9-7ff427e1c58b" containerID="724d8a9b85240c0b1df62f7319e7755ef432c021c652343b5814cdc6b0afd1ef" exitCode=1 Mar 19 09:21:23.932437 master-0 kubenswrapper[7457]: I0319 09:21:23.931877 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"06b63f3f-ca62-4195-80e9-7ff427e1c58b","Type":"ContainerDied","Data":"724d8a9b85240c0b1df62f7319e7755ef432c021c652343b5814cdc6b0afd1ef"} Mar 19 09:21:23.933710 master-0 kubenswrapper[7457]: I0319 09:21:23.933679 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_7fcfef1e-0652-4c52-a0a8-dfbf15408d03/installer/0.log" Mar 19 09:21:23.933781 master-0 kubenswrapper[7457]: I0319 09:21:23.933713 7457 generic.go:334] "Generic (PLEG): container finished" podID="7fcfef1e-0652-4c52-a0a8-dfbf15408d03" containerID="e6e89ca8e17f62a01908661aaa2823a47bd6605150b0289a08fa08b57ffe48cb" exitCode=1 Mar 19 09:21:23.933781 master-0 kubenswrapper[7457]: I0319 09:21:23.933751 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7fcfef1e-0652-4c52-a0a8-dfbf15408d03","Type":"ContainerDied","Data":"e6e89ca8e17f62a01908661aaa2823a47bd6605150b0289a08fa08b57ffe48cb"} Mar 19 09:21:23.935084 master-0 kubenswrapper[7457]: I0319 09:21:23.935060 7457 generic.go:334] "Generic (PLEG): container finished" podID="c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" containerID="e78d3dce63ea732d696eb7bff751b92b4362afe268916ddac8100e99641f0a5b" exitCode=0 Mar 19 09:21:23.935137 master-0 kubenswrapper[7457]: I0319 09:21:23.935109 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" event={"ID":"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f","Type":"ContainerDied","Data":"e78d3dce63ea732d696eb7bff751b92b4362afe268916ddac8100e99641f0a5b"} Mar 19 09:21:23.936451 master-0 kubenswrapper[7457]: I0319 09:21:23.936420 7457 generic.go:334] "Generic (PLEG): container finished" podID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerID="66bf409d4e408d987de5c230847e4aa6700de4c6d1fcbc684926aa5267864de3" exitCode=0 Mar 19 09:21:23.936451 master-0 kubenswrapper[7457]: I0319 09:21:23.936450 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" event={"ID":"c3526fc7-b3af-4146-a791-fad627e8c9fa","Type":"ContainerDied","Data":"66bf409d4e408d987de5c230847e4aa6700de4c6d1fcbc684926aa5267864de3"} Mar 19 09:21:25.953461 master-0 kubenswrapper[7457]: I0319 09:21:25.953406 7457 patch_prober.go:28] interesting pod/route-controller-manager-597786f6d8-qsfjn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:21:25.954294 master-0 kubenswrapper[7457]: I0319 09:21:25.953504 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:26.793467 master-0 kubenswrapper[7457]: I0319 09:21:26.793422 7457 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_7fcfef1e-0652-4c52-a0a8-dfbf15408d03/installer/0.log" Mar 19 09:21:26.793622 master-0 kubenswrapper[7457]: I0319 09:21:26.793565 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:26.800645 master-0 kubenswrapper[7457]: I0319 09:21:26.797424 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_06b63f3f-ca62-4195-80e9-7ff427e1c58b/installer/0.log" Mar 19 09:21:26.800645 master-0 kubenswrapper[7457]: I0319 09:21:26.797539 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:21:26.801421 master-0 kubenswrapper[7457]: I0319 09:21:26.801390 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" Mar 19 09:21:26.805335 master-0 kubenswrapper[7457]: I0319 09:21:26.805294 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:26.932707 master-0 kubenswrapper[7457]: I0319 09:21:26.931543 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndzd\" (UniqueName: \"kubernetes.io/projected/c3526fc7-b3af-4146-a791-fad627e8c9fa-kube-api-access-6ndzd\") pod \"c3526fc7-b3af-4146-a791-fad627e8c9fa\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.932786 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-client-ca\") pod \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.932842 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kube-api-access\") pod \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.932870 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-var-lock\") pod \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.932902 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kube-api-access\") pod \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 
09:21:26.932928 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-serving-cert\") pod \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.932948 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526fc7-b3af-4146-a791-fad627e8c9fa-serving-cert\") pod \"c3526fc7-b3af-4146-a791-fad627e8c9fa\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.932967 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-config\") pod \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.932988 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-config\") pod \"c3526fc7-b3af-4146-a791-fad627e8c9fa\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.933007 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kubelet-dir\") pod \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\" (UID: \"7fcfef1e-0652-4c52-a0a8-dfbf15408d03\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.933031 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f6sk\" (UniqueName: 
\"kubernetes.io/projected/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-kube-api-access-2f6sk\") pod \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.933053 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kubelet-dir\") pod \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.933085 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-var-lock\") pod \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\" (UID: \"06b63f3f-ca62-4195-80e9-7ff427e1c58b\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.933111 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-proxy-ca-bundles\") pod \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\" (UID: \"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f\") " Mar 19 09:21:26.933375 master-0 kubenswrapper[7457]: I0319 09:21:26.933135 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-client-ca\") pod \"c3526fc7-b3af-4146-a791-fad627e8c9fa\" (UID: \"c3526fc7-b3af-4146-a791-fad627e8c9fa\") " Mar 19 09:21:26.934139 master-0 kubenswrapper[7457]: I0319 09:21:26.933959 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-var-lock" (OuterVolumeSpecName: "var-lock") pod "7fcfef1e-0652-4c52-a0a8-dfbf15408d03" (UID: "7fcfef1e-0652-4c52-a0a8-dfbf15408d03"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:26.935083 master-0 kubenswrapper[7457]: I0319 09:21:26.934730 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-client-ca" (OuterVolumeSpecName: "client-ca") pod "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" (UID: "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:26.935083 master-0 kubenswrapper[7457]: I0319 09:21:26.934780 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "06b63f3f-ca62-4195-80e9-7ff427e1c58b" (UID: "06b63f3f-ca62-4195-80e9-7ff427e1c58b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:26.935083 master-0 kubenswrapper[7457]: I0319 09:21:26.934804 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3526fc7-b3af-4146-a791-fad627e8c9fa" (UID: "c3526fc7-b3af-4146-a791-fad627e8c9fa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:26.935083 master-0 kubenswrapper[7457]: I0319 09:21:26.934815 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-var-lock" (OuterVolumeSpecName: "var-lock") pod "06b63f3f-ca62-4195-80e9-7ff427e1c58b" (UID: "06b63f3f-ca62-4195-80e9-7ff427e1c58b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:26.935250 master-0 kubenswrapper[7457]: I0319 09:21:26.935103 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7fcfef1e-0652-4c52-a0a8-dfbf15408d03" (UID: "7fcfef1e-0652-4c52-a0a8-dfbf15408d03"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:26.935250 master-0 kubenswrapper[7457]: I0319 09:21:26.935193 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" (UID: "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:26.935634 master-0 kubenswrapper[7457]: I0319 09:21:26.935553 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-config" (OuterVolumeSpecName: "config") pod "c3526fc7-b3af-4146-a791-fad627e8c9fa" (UID: "c3526fc7-b3af-4146-a791-fad627e8c9fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:26.935634 master-0 kubenswrapper[7457]: I0319 09:21:26.935615 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-config" (OuterVolumeSpecName: "config") pod "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" (UID: "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:26.938017 master-0 kubenswrapper[7457]: I0319 09:21:26.937910 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3526fc7-b3af-4146-a791-fad627e8c9fa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3526fc7-b3af-4146-a791-fad627e8c9fa" (UID: "c3526fc7-b3af-4146-a791-fad627e8c9fa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:21:26.939898 master-0 kubenswrapper[7457]: I0319 09:21:26.939837 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-kube-api-access-2f6sk" (OuterVolumeSpecName: "kube-api-access-2f6sk") pod "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" (UID: "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f"). InnerVolumeSpecName "kube-api-access-2f6sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:26.941725 master-0 kubenswrapper[7457]: I0319 09:21:26.941655 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" (UID: "c76b3023-dcc2-4ea3-b590-bf7fd718fc3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:21:26.942664 master-0 kubenswrapper[7457]: I0319 09:21:26.942499 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7fcfef1e-0652-4c52-a0a8-dfbf15408d03" (UID: "7fcfef1e-0652-4c52-a0a8-dfbf15408d03"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:26.944034 master-0 kubenswrapper[7457]: I0319 09:21:26.943988 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3526fc7-b3af-4146-a791-fad627e8c9fa-kube-api-access-6ndzd" (OuterVolumeSpecName: "kube-api-access-6ndzd") pod "c3526fc7-b3af-4146-a791-fad627e8c9fa" (UID: "c3526fc7-b3af-4146-a791-fad627e8c9fa"). InnerVolumeSpecName "kube-api-access-6ndzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:26.944759 master-0 kubenswrapper[7457]: I0319 09:21:26.944614 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "06b63f3f-ca62-4195-80e9-7ff427e1c58b" (UID: "06b63f3f-ca62-4195-80e9-7ff427e1c58b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:26.969023 master-0 kubenswrapper[7457]: I0319 09:21:26.968847 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_06b63f3f-ca62-4195-80e9-7ff427e1c58b/installer/0.log" Mar 19 09:21:26.969023 master-0 kubenswrapper[7457]: I0319 09:21:26.968915 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"06b63f3f-ca62-4195-80e9-7ff427e1c58b","Type":"ContainerDied","Data":"238008c11c92aeb65c62f807e37180e81b3384c3527719ecefddec5721f48b97"} Mar 19 09:21:26.969023 master-0 kubenswrapper[7457]: I0319 09:21:26.968951 7457 scope.go:117] "RemoveContainer" containerID="724d8a9b85240c0b1df62f7319e7755ef432c021c652343b5814cdc6b0afd1ef" Mar 19 09:21:26.969901 master-0 kubenswrapper[7457]: I0319 09:21:26.969037 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:21:26.990997 master-0 kubenswrapper[7457]: I0319 09:21:26.988461 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_7fcfef1e-0652-4c52-a0a8-dfbf15408d03/installer/0.log" Mar 19 09:21:26.990997 master-0 kubenswrapper[7457]: I0319 09:21:26.988562 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7fcfef1e-0652-4c52-a0a8-dfbf15408d03","Type":"ContainerDied","Data":"a7b68d32c6a1be3d1b63c4c0cc7b8825417f61ee0fb6f0e7084f5845530ba18c"} Mar 19 09:21:26.990997 master-0 kubenswrapper[7457]: I0319 09:21:26.988633 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:27.000245 master-0 kubenswrapper[7457]: I0319 09:21:27.000196 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" event={"ID":"c76b3023-dcc2-4ea3-b590-bf7fd718fc3f","Type":"ContainerDied","Data":"0c33816ff9a691d939d3249416d2119d08c88e42a7df1593bf891ba67f33f9b1"} Mar 19 09:21:27.000318 master-0 kubenswrapper[7457]: I0319 09:21:27.000301 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f89774b7d-nrm4r" Mar 19 09:21:27.012214 master-0 kubenswrapper[7457]: I0319 09:21:27.011797 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" event={"ID":"c3526fc7-b3af-4146-a791-fad627e8c9fa","Type":"ContainerDied","Data":"e391dcffd584ac2ff23381e87dc61df1ca458dbf6b1c4be91fe138eb91648bf7"} Mar 19 09:21:27.012214 master-0 kubenswrapper[7457]: I0319 09:21:27.011926 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.033992 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndzd\" (UniqueName: \"kubernetes.io/projected/c3526fc7-b3af-4146-a791-fad627e8c9fa-kube-api-access-6ndzd\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034021 7457 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034031 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034039 7457 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034048 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034058 7457 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034072 7457 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c3526fc7-b3af-4146-a791-fad627e8c9fa-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034113 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034125 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034137 7457 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fcfef1e-0652-4c52-a0a8-dfbf15408d03-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034148 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f6sk\" (UniqueName: \"kubernetes.io/projected/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-kube-api-access-2f6sk\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034158 7457 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034169 7457 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06b63f3f-ca62-4195-80e9-7ff427e1c58b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034180 7457 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.034510 master-0 kubenswrapper[7457]: I0319 09:21:27.034192 7457 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526fc7-b3af-4146-a791-fad627e8c9fa-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:27.045067 master-0 kubenswrapper[7457]: I0319 09:21:27.044673 7457 scope.go:117] "RemoveContainer" containerID="e6e89ca8e17f62a01908661aaa2823a47bd6605150b0289a08fa08b57ffe48cb" Mar 19 09:21:27.141172 master-0 kubenswrapper[7457]: I0319 09:21:27.140970 7457 scope.go:117] "RemoveContainer" containerID="e78d3dce63ea732d696eb7bff751b92b4362afe268916ddac8100e99641f0a5b" Mar 19 09:21:27.171938 master-0 kubenswrapper[7457]: I0319 09:21:27.171391 7457 scope.go:117] "RemoveContainer" containerID="66bf409d4e408d987de5c230847e4aa6700de4c6d1fcbc684926aa5267864de3" Mar 19 09:21:28.026568 master-0 kubenswrapper[7457]: I0319 09:21:28.025868 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" event={"ID":"16d2930b-486b-492d-983e-c6702d8f53a7","Type":"ContainerStarted","Data":"a5186430fb985d7ebb15a1a1f2a6af42201f36c0a215d80ae48f52255c02b6b0"} Mar 19 09:21:28.026568 master-0 kubenswrapper[7457]: I0319 09:21:28.026474 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" event={"ID":"16d2930b-486b-492d-983e-c6702d8f53a7","Type":"ContainerStarted","Data":"356b06972093c2085b6c093c1670412d09d2175a5b8cb02c15b08e69301fe5f1"} Mar 19 09:21:28.027619 master-0 kubenswrapper[7457]: I0319 09:21:28.027224 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" 
event={"ID":"9076d131-644a-4332-8a70-34f6b0f71575","Type":"ContainerStarted","Data":"4bddbe0181ee7be4a4759326c4ea480ecc4661debbd026d4925858f87d0a1138"} Mar 19 09:21:28.029987 master-0 kubenswrapper[7457]: I0319 09:21:28.029739 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" event={"ID":"51b88818-5108-40db-90c8-4f2e7198959e","Type":"ContainerStarted","Data":"caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9"} Mar 19 09:21:28.032222 master-0 kubenswrapper[7457]: I0319 09:21:28.032185 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" event={"ID":"03d12dab-1215-4c1f-a9f5-27ea7174d308","Type":"ContainerStarted","Data":"8329f09fba15557facf0fb8240366aaf172da109e0392af8985f771239026963"} Mar 19 09:21:28.032222 master-0 kubenswrapper[7457]: I0319 09:21:28.032220 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" event={"ID":"03d12dab-1215-4c1f-a9f5-27ea7174d308","Type":"ContainerStarted","Data":"761c314b9fe9ef180b6271814e6fada9d11c95db600b4fe44a34208c56ddcebd"} Mar 19 09:21:28.035562 master-0 kubenswrapper[7457]: I0319 09:21:28.035504 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" event={"ID":"7b29cb7b-26d2-4fab-9e03-2d7fdf937592","Type":"ContainerStarted","Data":"2aae1324ea9ac71e757c6b6742bbfe17bf26ff22a4f1597837954f981813c18e"} Mar 19 09:21:28.035744 master-0 kubenswrapper[7457]: I0319 09:21:28.035715 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:21:28.037519 master-0 kubenswrapper[7457]: I0319 09:21:28.037480 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" 
event={"ID":"8c8ee765-76b8-4cde-8acb-6e5edd1b8149","Type":"ContainerStarted","Data":"63b23bb8d76ed0e926a1343679e4f9708af3d51ac77a25af6ed81f49e628c18a"} Mar 19 09:21:28.039237 master-0 kubenswrapper[7457]: I0319 09:21:28.039198 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" event={"ID":"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e","Type":"ContainerStarted","Data":"d4be4fce578d40cec533c0ab0b2ea7b1d2f8bbfad85eab154b0c3268083f1916"} Mar 19 09:21:28.039450 master-0 kubenswrapper[7457]: I0319 09:21:28.039408 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:21:28.040551 master-0 kubenswrapper[7457]: I0319 09:21:28.040504 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:21:28.045551 master-0 kubenswrapper[7457]: I0319 09:21:28.045476 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:21:28.047621 master-0 kubenswrapper[7457]: I0319 09:21:28.047573 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" event={"ID":"8beda3a0-a653-4810-b3f2-d25badb21ab1","Type":"ContainerStarted","Data":"5a1232e74d2b81fa0fb089837e46ec811c58ea20165c36d4de9800956bf481df"} Mar 19 09:21:28.047688 master-0 kubenswrapper[7457]: I0319 09:21:28.047630 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" event={"ID":"8beda3a0-a653-4810-b3f2-d25badb21ab1","Type":"ContainerStarted","Data":"ec2a8f37c4a4bf290761beb86b8148cabc7c9a7b8241accf763dd14e9ad11acc"} Mar 19 09:21:28.050160 master-0 kubenswrapper[7457]: I0319 09:21:28.050109 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-nq9vs" event={"ID":"13072c08-c77c-4170-9ebe-98d63968747b","Type":"ContainerStarted","Data":"245d99e7883ffc07a4a8ceaedfa4d89a79c2e00aa8c866229503a95019ecbd07"} Mar 19 09:21:28.050160 master-0 kubenswrapper[7457]: I0319 09:21:28.050154 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nq9vs" event={"ID":"13072c08-c77c-4170-9ebe-98d63968747b","Type":"ContainerStarted","Data":"c8cfb266b227f142294c82f73924657272f28ebc09cb093d39a161a875872cb3"} Mar 19 09:21:28.052751 master-0 kubenswrapper[7457]: I0319 09:21:28.052704 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" event={"ID":"9a6c1523-e77c-4aac-814c-05d41215c42f","Type":"ContainerStarted","Data":"6eac1964a5e72aa12c65bafd864366d67117587af3993a267a04a961b129a449"} Mar 19 09:21:28.052926 master-0 kubenswrapper[7457]: I0319 09:21:28.052887 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:21:28.057593 master-0 kubenswrapper[7457]: I0319 09:21:28.057479 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" event={"ID":"259794ab-d027-497a-b08e-5a6d79057668","Type":"ContainerStarted","Data":"0bb6d4411c90b21c40d2ebf35d55a831d972d567e97bde63d3acc0f2997756c7"} Mar 19 09:21:28.058021 master-0 kubenswrapper[7457]: I0319 09:21:28.057977 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:21:28.061689 master-0 kubenswrapper[7457]: I0319 09:21:28.060671 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" 
event={"ID":"c247d991-809e-46b6-9617-9b05007b7560","Type":"ContainerStarted","Data":"454861dcc46302037af52899fc563555f9ba3061ac1f1a6fd669ea5cd9d7f5b8"} Mar 19 09:21:28.062608 master-0 kubenswrapper[7457]: I0319 09:21:28.062573 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:21:34.425005 master-0 kubenswrapper[7457]: E0319 09:21:34.424908 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:34.974974 master-0 kubenswrapper[7457]: E0319 09:21:34.974877 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 09:21:34.975349 master-0 kubenswrapper[7457]: I0319 09:21:34.975313 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:21:34.997416 master-0 kubenswrapper[7457]: W0319 09:21:34.997253 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b4ed170d527099878cb5fdd508a2fb.slice/crio-7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5 WatchSource:0}: Error finding container 7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5: Status 404 returned error can't find the container with id 7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5 Mar 19 09:21:35.100091 master-0 kubenswrapper[7457]: I0319 09:21:35.100038 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5"} Mar 19 09:21:36.105405 master-0 kubenswrapper[7457]: I0319 09:21:36.105128 7457 generic.go:334] "Generic (PLEG): container finished" podID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerID="301aebb0e9930fecf725f0201f719e1159eb2c1c4f88b41cf02dfb10a0bbec0d" exitCode=0 Mar 19 09:21:36.106060 master-0 kubenswrapper[7457]: I0319 09:21:36.105239 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a","Type":"ContainerDied","Data":"301aebb0e9930fecf725f0201f719e1159eb2c1c4f88b41cf02dfb10a0bbec0d"} Mar 19 09:21:36.107465 master-0 kubenswrapper[7457]: I0319 09:21:36.107363 7457 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0" exitCode=0 Mar 19 09:21:36.107465 master-0 kubenswrapper[7457]: I0319 09:21:36.107425 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0"} Mar 19 09:21:36.111989 master-0 kubenswrapper[7457]: I0319 09:21:36.109752 7457 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="a74f0437d5a92c82edd9e58f193503c363594aaca67bff5a5ae6fcd1a5a28477" exitCode=1 Mar 19 09:21:36.111989 master-0 kubenswrapper[7457]: I0319 09:21:36.109983 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"a74f0437d5a92c82edd9e58f193503c363594aaca67bff5a5ae6fcd1a5a28477"} Mar 19 09:21:36.111989 master-0 kubenswrapper[7457]: I0319 09:21:36.110008 7457 scope.go:117] "RemoveContainer" containerID="382712d4a8a720b54161d083c15e892932ef38c413a22bb647480e2f84ff33a9" Mar 19 09:21:36.111989 master-0 kubenswrapper[7457]: I0319 09:21:36.110372 7457 scope.go:117] "RemoveContainer" containerID="a74f0437d5a92c82edd9e58f193503c363594aaca67bff5a5ae6fcd1a5a28477" Mar 19 09:21:37.118341 master-0 kubenswrapper[7457]: I0319 09:21:37.118293 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"b5a024c432c7340543e69f0bd7bb2379e363c0a3445c80c57fd287fd74ddf6ae"} Mar 19 09:21:37.336820 master-0 kubenswrapper[7457]: I0319 09:21:37.336482 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:37.458452 master-0 kubenswrapper[7457]: I0319 09:21:37.458308 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kube-api-access\") pod \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " Mar 19 09:21:37.458452 master-0 kubenswrapper[7457]: I0319 09:21:37.458402 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-var-lock\") pod \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " Mar 19 09:21:37.458452 master-0 kubenswrapper[7457]: I0319 09:21:37.458439 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kubelet-dir\") pod \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\" (UID: \"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a\") " Mar 19 09:21:37.458933 master-0 kubenswrapper[7457]: I0319 09:21:37.458608 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-var-lock" (OuterVolumeSpecName: "var-lock") pod "c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" (UID: "c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:37.458933 master-0 kubenswrapper[7457]: I0319 09:21:37.458668 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" (UID: "c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:37.463790 master-0 kubenswrapper[7457]: I0319 09:21:37.463727 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" (UID: "c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:37.559710 master-0 kubenswrapper[7457]: I0319 09:21:37.559648 7457 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:37.560451 master-0 kubenswrapper[7457]: I0319 09:21:37.560427 7457 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:37.560618 master-0 kubenswrapper[7457]: I0319 09:21:37.560598 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:38.126330 master-0 kubenswrapper[7457]: I0319 09:21:38.126243 7457 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="9a59b0cbe8ea8fa4b17a290e74267cd3c1f43f118142de7e624d510bbb389da7" exitCode=1 Mar 19 09:21:38.127402 master-0 kubenswrapper[7457]: I0319 09:21:38.126369 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"9a59b0cbe8ea8fa4b17a290e74267cd3c1f43f118142de7e624d510bbb389da7"} Mar 19 09:21:38.127402 master-0 kubenswrapper[7457]: 
I0319 09:21:38.126832 7457 scope.go:117] "RemoveContainer" containerID="9a59b0cbe8ea8fa4b17a290e74267cd3c1f43f118142de7e624d510bbb389da7" Mar 19 09:21:38.128603 master-0 kubenswrapper[7457]: I0319 09:21:38.128540 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a","Type":"ContainerDied","Data":"5102e60e07b0c0187e422871eca34d56a4b64890354534ac5bb4405ed5a663d3"} Mar 19 09:21:38.128603 master-0 kubenswrapper[7457]: I0319 09:21:38.128578 7457 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5102e60e07b0c0187e422871eca34d56a4b64890354534ac5bb4405ed5a663d3" Mar 19 09:21:38.128777 master-0 kubenswrapper[7457]: I0319 09:21:38.128657 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:38.227376 master-0 kubenswrapper[7457]: E0319 09:21:38.227273 7457 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:28Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:28Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:28Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:28Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f2
5da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:39.136262 master-0 kubenswrapper[7457]: I0319 09:21:39.135980 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"a80a075ae2d2bfe0e545df390d9ff0ad18516cad1ed3ad4a716e570d8e5f21c1"} Mar 19 09:21:39.293167 master-0 kubenswrapper[7457]: I0319 09:21:39.293104 7457 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-5bddk container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.11:8443/healthz\": dial tcp 10.128.0.11:8443: connect: connection refused" start-of-body= Mar 19 09:21:39.293167 master-0 kubenswrapper[7457]: I0319 09:21:39.293161 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" podUID="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" containerName="etcd-operator" probeResult="failure" 
output="Get \"https://10.128.0.11:8443/healthz\": dial tcp 10.128.0.11:8443: connect: connection refused" Mar 19 09:21:41.951822 master-0 kubenswrapper[7457]: I0319 09:21:41.951763 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:21:43.159256 master-0 kubenswrapper[7457]: I0319 09:21:43.159192 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-v9898_8527f5cd-2992-44be-90b8-e9086cedf46e/openshift-controller-manager-operator/0.log" Mar 19 09:21:43.159256 master-0 kubenswrapper[7457]: I0319 09:21:43.159243 7457 generic.go:334] "Generic (PLEG): container finished" podID="8527f5cd-2992-44be-90b8-e9086cedf46e" containerID="c9da4601818f501772c5c387239e3219ab4432a2bb45b7271b716c82c40ddaf7" exitCode=1 Mar 19 09:21:43.160081 master-0 kubenswrapper[7457]: I0319 09:21:43.159272 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" event={"ID":"8527f5cd-2992-44be-90b8-e9086cedf46e","Type":"ContainerDied","Data":"c9da4601818f501772c5c387239e3219ab4432a2bb45b7271b716c82c40ddaf7"} Mar 19 09:21:43.160081 master-0 kubenswrapper[7457]: I0319 09:21:43.159617 7457 scope.go:117] "RemoveContainer" containerID="c9da4601818f501772c5c387239e3219ab4432a2bb45b7271b716c82c40ddaf7" Mar 19 09:21:44.169519 master-0 kubenswrapper[7457]: I0319 09:21:44.169200 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-v9898_8527f5cd-2992-44be-90b8-e9086cedf46e/openshift-controller-manager-operator/0.log" Mar 19 09:21:44.170263 master-0 kubenswrapper[7457]: I0319 09:21:44.169569 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" event={"ID":"8527f5cd-2992-44be-90b8-e9086cedf46e","Type":"ContainerStarted","Data":"a834f0cd8fb684d8e2dd9495126e065c4a7492369ef2541c40f6fa239498416e"} Mar 19 09:21:44.426411 master-0 kubenswrapper[7457]: E0319 09:21:44.426209 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:44.814336 master-0 kubenswrapper[7457]: I0319 09:21:44.814235 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:21:47.815500 master-0 kubenswrapper[7457]: I0319 09:21:47.815381 7457 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:48.228148 master-0 kubenswrapper[7457]: E0319 09:21:48.227936 7457 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:49.113491 master-0 kubenswrapper[7457]: E0319 09:21:49.113415 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 09:21:49.302181 master-0 kubenswrapper[7457]: E0319 09:21:49.302131 7457 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664a6d0d2a24360dee10612610f1b59.slice/crio-conmon-5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664a6d0d2a24360dee10612610f1b59.slice/crio-5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:21:50.218689 master-0 kubenswrapper[7457]: I0319 09:21:50.218584 7457 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538" exitCode=0 Mar 19 09:21:50.221274 master-0 kubenswrapper[7457]: I0319 09:21:50.221239 7457 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360" exitCode=0 Mar 19 09:21:50.221362 master-0 kubenswrapper[7457]: I0319 09:21:50.221278 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360"} Mar 19 09:21:52.055819 master-0 kubenswrapper[7457]: I0319 09:21:52.055773 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 19 09:21:52.056190 master-0 kubenswrapper[7457]: I0319 09:21:52.055863 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:21:52.233215 master-0 kubenswrapper[7457]: I0319 09:21:52.233110 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 19 09:21:52.233215 master-0 kubenswrapper[7457]: I0319 09:21:52.233157 7457 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98" exitCode=137 Mar 19 09:21:52.233215 master-0 kubenswrapper[7457]: I0319 09:21:52.233201 7457 scope.go:117] "RemoveContainer" containerID="5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538" Mar 19 09:21:52.233559 master-0 kubenswrapper[7457]: I0319 09:21:52.233303 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:21:52.235037 master-0 kubenswrapper[7457]: I0319 09:21:52.234976 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 19 09:21:52.235150 master-0 kubenswrapper[7457]: I0319 09:21:52.235092 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 19 09:21:52.235279 master-0 kubenswrapper[7457]: I0319 09:21:52.235234 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs" (OuterVolumeSpecName: "certs") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:52.235279 master-0 kubenswrapper[7457]: I0319 09:21:52.235252 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir" (OuterVolumeSpecName: "data-dir") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:52.246551 master-0 kubenswrapper[7457]: I0319 09:21:52.246493 7457 scope.go:117] "RemoveContainer" containerID="ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98" Mar 19 09:21:52.268816 master-0 kubenswrapper[7457]: I0319 09:21:52.268770 7457 scope.go:117] "RemoveContainer" containerID="5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538" Mar 19 09:21:52.269379 master-0 kubenswrapper[7457]: E0319 09:21:52.269331 7457 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538\": container with ID starting with 5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538 not found: ID does not exist" containerID="5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538" Mar 19 09:21:52.269469 master-0 kubenswrapper[7457]: I0319 09:21:52.269386 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538"} err="failed to get container status \"5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538\": rpc error: code = NotFound desc = could not find container \"5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538\": container with ID starting with 5d8633d587af247c09420289c37e602ebd710d9ba8cae57212bb190c4bbb2538 not found: ID does not exist" Mar 19 09:21:52.269469 master-0 kubenswrapper[7457]: 
I0319 09:21:52.269467 7457 scope.go:117] "RemoveContainer" containerID="ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98" Mar 19 09:21:52.270102 master-0 kubenswrapper[7457]: E0319 09:21:52.270058 7457 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98\": container with ID starting with ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98 not found: ID does not exist" containerID="ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98" Mar 19 09:21:52.270142 master-0 kubenswrapper[7457]: I0319 09:21:52.270111 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98"} err="failed to get container status \"ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98\": rpc error: code = NotFound desc = could not find container \"ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98\": container with ID starting with ef092117b7265d7ec9f7f9776ce01191a3d05b7b5055152382f48cf10fc7df98 not found: ID does not exist" Mar 19 09:21:52.336761 master-0 kubenswrapper[7457]: I0319 09:21:52.336618 7457 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:52.336761 master-0 kubenswrapper[7457]: I0319 09:21:52.336693 7457 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:52.343116 master-0 kubenswrapper[7457]: I0319 09:21:52.343056 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d664a6d0d2a24360dee10612610f1b59" 
path="/var/lib/kubelet/pods/d664a6d0d2a24360dee10612610f1b59/volumes" Mar 19 09:21:52.343551 master-0 kubenswrapper[7457]: I0319 09:21:52.343496 7457 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 09:21:54.427308 master-0 kubenswrapper[7457]: E0319 09:21:54.426956 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:55.253746 master-0 kubenswrapper[7457]: I0319 09:21:55.253676 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_434aabfa-50db-407e-92d3-a034696613e3/installer/0.log" Mar 19 09:21:55.253746 master-0 kubenswrapper[7457]: I0319 09:21:55.253742 7457 generic.go:334] "Generic (PLEG): container finished" podID="434aabfa-50db-407e-92d3-a034696613e3" containerID="5a3b40e5aadf949e686ac4f447f2417ca9edf3ac74f9cc8e180b0ad3fbdc1cbc" exitCode=1 Mar 19 09:21:55.948751 master-0 kubenswrapper[7457]: E0319 09:21:55.948515 7457 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e33a3d1b5c513 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:21:21.921590547 +0000 UTC m=+77.776929917,LastTimestamp:2026-03-19 09:21:21.921590547 +0000 UTC m=+77.776929917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:21:57.168193 master-0 kubenswrapper[7457]: I0319 09:21:57.168125 7457 patch_prober.go:28] interesting pod/authentication-operator-5885bfd7f4-k4dfd container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body= Mar 19 09:21:57.168193 master-0 kubenswrapper[7457]: I0319 09:21:57.168184 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" podUID="e7fae040-28fa-4d97-8482-fd0dd12cc921" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" Mar 19 09:21:57.815220 master-0 kubenswrapper[7457]: I0319 09:21:57.815121 7457 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:58.229395 master-0 kubenswrapper[7457]: E0319 09:21:58.229239 7457 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:02.287211 master-0 kubenswrapper[7457]: I0319 09:22:02.287104 7457 generic.go:334] "Generic (PLEG): container finished" podID="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" containerID="dbe5b6ac78d411669d4c2885f202f3dc2681af9deb4ef2161f47be9747a76bd6" exitCode=0 Mar 19 09:22:03.226214 
master-0 kubenswrapper[7457]: E0319 09:22:03.226151 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:04.299027 master-0 kubenswrapper[7457]: I0319 09:22:04.298965 7457 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c" exitCode=0 Mar 19 09:22:04.429135 master-0 kubenswrapper[7457]: E0319 09:22:04.428804 7457 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io master-0)" Mar 19 09:22:07.168907 master-0 kubenswrapper[7457]: I0319 09:22:07.168779 7457 patch_prober.go:28] interesting pod/authentication-operator-5885bfd7f4-k4dfd container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body= Mar 19 09:22:07.168907 master-0 kubenswrapper[7457]: I0319 09:22:07.168872 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" podUID="e7fae040-28fa-4d97-8482-fd0dd12cc921" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" Mar 19 09:22:07.815765 master-0 kubenswrapper[7457]: I0319 09:22:07.815381 7457 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:08.229939 master-0 kubenswrapper[7457]: E0319 09:22:08.229747 7457 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:12.342213 master-0 kubenswrapper[7457]: I0319 09:22:12.342120 7457 generic.go:334] "Generic (PLEG): container finished" podID="d664acc4-ec4f-4078-ae93-404a14ea18fc" containerID="f068dc00867ec832963c43c66c2b3ba5e5c27207844ca25057536cc59dfa3810" exitCode=0 Mar 19 09:22:14.430679 master-0 kubenswrapper[7457]: E0319 09:22:14.430201 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:14.430679 master-0 kubenswrapper[7457]: I0319 09:22:14.430674 7457 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:22:15.354787 master-0 kubenswrapper[7457]: I0319 09:22:15.354689 7457 generic.go:334] "Generic (PLEG): container finished" podID="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" containerID="33c416f3ddb853fb82ea998149e13a2a8f2bd563b1774b31ddf6b2c491ae3aa9" exitCode=0 Mar 19 09:22:15.355914 master-0 kubenswrapper[7457]: I0319 09:22:15.355890 7457 generic.go:334] "Generic (PLEG): container finished" podID="3b333a1e-2a7f-423a-8b40-99f30c89f740" containerID="e7857b0cae9f1e592c846367f20964b7bdba92f2c028bce9260e23037d2618d9" exitCode=0 Mar 19 09:22:17.167879 master-0 kubenswrapper[7457]: I0319 09:22:17.167798 7457 patch_prober.go:28] interesting pod/authentication-operator-5885bfd7f4-k4dfd container/authentication-operator namespace/openshift-authentication-operator: 
Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body= Mar 19 09:22:17.167879 master-0 kubenswrapper[7457]: I0319 09:22:17.167860 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" podUID="e7fae040-28fa-4d97-8482-fd0dd12cc921" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" Mar 19 09:22:17.305084 master-0 kubenswrapper[7457]: E0319 09:22:17.305014 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:18.230621 master-0 kubenswrapper[7457]: E0319 09:22:18.230498 7457 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:18.230621 master-0 kubenswrapper[7457]: E0319 09:22:18.230544 7457 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:22:19.383809 master-0 kubenswrapper[7457]: I0319 09:22:19.383398 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/0.log" Mar 19 09:22:19.384576 master-0 kubenswrapper[7457]: I0319 09:22:19.384191 7457 generic.go:334] "Generic (PLEG): container finished" podID="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" containerID="16dabbfac23a88b18e7a1e5f639f318226358e768cd4e0f4bf6b8327e7b845c9" exitCode=1 Mar 19 09:22:22.399423 master-0 kubenswrapper[7457]: I0319 09:22:22.399348 7457 generic.go:334] "Generic 
(PLEG): container finished" podID="e7fae040-28fa-4d97-8482-fd0dd12cc921" containerID="7899eaeea83e799e75607f310011944713a832305f4796c7131bde2f6c40224c" exitCode=0 Mar 19 09:22:24.432029 master-0 kubenswrapper[7457]: E0319 09:22:24.431902 7457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 19 09:22:25.417982 master-0 kubenswrapper[7457]: I0319 09:22:25.417621 7457 generic.go:334] "Generic (PLEG): container finished" podID="f0c75102-6790-4ed3-84da-61c3611186f8" containerID="46cd0596efe1a555d079c79fdb72a64ad03bb94cd6e0d19c502033e4b3f35b63" exitCode=0 Mar 19 09:22:26.347681 master-0 kubenswrapper[7457]: E0319 09:22:26.347612 7457 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:22:26.348591 master-0 kubenswrapper[7457]: E0319 09:22:26.347901 7457 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.013s" Mar 19 09:22:26.348591 master-0 kubenswrapper[7457]: I0319 09:22:26.347951 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:22:26.348591 master-0 kubenswrapper[7457]: I0319 09:22:26.347970 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:22:26.348591 master-0 kubenswrapper[7457]: I0319 09:22:26.348013 7457 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:22:26.348591 master-0 
kubenswrapper[7457]: I0319 09:22:26.348026 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"434aabfa-50db-407e-92d3-a034696613e3","Type":"ContainerDied","Data":"5a3b40e5aadf949e686ac4f447f2417ca9edf3ac74f9cc8e180b0ad3fbdc1cbc"} Mar 19 09:22:26.349585 master-0 kubenswrapper[7457]: I0319 09:22:26.349491 7457 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b5a024c432c7340543e69f0bd7bb2379e363c0a3445c80c57fd287fd74ddf6ae"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 19 09:22:26.349585 master-0 kubenswrapper[7457]: I0319 09:22:26.349577 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://b5a024c432c7340543e69f0bd7bb2379e363c0a3445c80c57fd287fd74ddf6ae" gracePeriod=30 Mar 19 09:22:26.350439 master-0 kubenswrapper[7457]: I0319 09:22:26.350319 7457 scope.go:117] "RemoveContainer" containerID="7899eaeea83e799e75607f310011944713a832305f4796c7131bde2f6c40224c" Mar 19 09:22:26.359845 master-0 kubenswrapper[7457]: I0319 09:22:26.359492 7457 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 09:22:26.735288 master-0 kubenswrapper[7457]: I0319 09:22:26.735238 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_434aabfa-50db-407e-92d3-a034696613e3/installer/0.log" Mar 19 09:22:26.735453 master-0 kubenswrapper[7457]: I0319 09:22:26.735309 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:22:26.795725 master-0 kubenswrapper[7457]: I0319 09:22:26.795646 7457 status_manager.go:851] "Failed to get status for pod" podUID="7fcfef1e-0652-4c52-a0a8-dfbf15408d03" pod="openshift-kube-scheduler/installer-3-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-3-master-0)" Mar 19 09:22:26.880862 master-0 kubenswrapper[7457]: I0319 09:22:26.880756 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-kubelet-dir\") pod \"434aabfa-50db-407e-92d3-a034696613e3\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " Mar 19 09:22:26.881058 master-0 kubenswrapper[7457]: I0319 09:22:26.881045 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-var-lock\") pod \"434aabfa-50db-407e-92d3-a034696613e3\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " Mar 19 09:22:26.881174 master-0 kubenswrapper[7457]: I0319 09:22:26.881162 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/434aabfa-50db-407e-92d3-a034696613e3-kube-api-access\") pod \"434aabfa-50db-407e-92d3-a034696613e3\" (UID: \"434aabfa-50db-407e-92d3-a034696613e3\") " Mar 19 09:22:26.881315 master-0 kubenswrapper[7457]: I0319 09:22:26.880836 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "434aabfa-50db-407e-92d3-a034696613e3" (UID: "434aabfa-50db-407e-92d3-a034696613e3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:26.881356 master-0 kubenswrapper[7457]: I0319 09:22:26.881071 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-var-lock" (OuterVolumeSpecName: "var-lock") pod "434aabfa-50db-407e-92d3-a034696613e3" (UID: "434aabfa-50db-407e-92d3-a034696613e3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:26.881486 master-0 kubenswrapper[7457]: I0319 09:22:26.881473 7457 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:26.881568 master-0 kubenswrapper[7457]: I0319 09:22:26.881556 7457 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/434aabfa-50db-407e-92d3-a034696613e3-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:26.887343 master-0 kubenswrapper[7457]: I0319 09:22:26.886264 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434aabfa-50db-407e-92d3-a034696613e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "434aabfa-50db-407e-92d3-a034696613e3" (UID: "434aabfa-50db-407e-92d3-a034696613e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:22:26.982478 master-0 kubenswrapper[7457]: I0319 09:22:26.982411 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/434aabfa-50db-407e-92d3-a034696613e3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:27.229163 master-0 kubenswrapper[7457]: E0319 09:22:27.229120 7457 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:22:27.229163 master-0 kubenswrapper[7457]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-1-master-0_openshift-kube-controller-manager_43ca4232-9e9c-4b97-9c29-bead80a9a5fa_0(a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5): error adding pod openshift-kube-controller-manager_installer-1-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5" Netns:"/var/run/netns/77e126b6-96d1-4e7d-b76a-5042e6f021f0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5;K8S_POD_UID=43ca4232-9e9c-4b97-9c29-bead80a9a5fa" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-1-master-0] networking: Multus: [openshift-kube-controller-manager/installer-1-master-0/43ca4232-9e9c-4b97-9c29-bead80a9a5fa]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-1-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:22:27.229163 master-0 kubenswrapper[7457]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:27.229163 master-0 kubenswrapper[7457]: > Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: E0319 09:22:27.229191 7457 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-1-master-0_openshift-kube-controller-manager_43ca4232-9e9c-4b97-9c29-bead80a9a5fa_0(a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5): error adding pod openshift-kube-controller-manager_installer-1-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5" Netns:"/var/run/netns/77e126b6-96d1-4e7d-b76a-5042e6f021f0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5;K8S_POD_UID=43ca4232-9e9c-4b97-9c29-bead80a9a5fa" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-1-master-0] networking: Multus: [openshift-kube-controller-manager/installer-1-master-0/43ca4232-9e9c-4b97-9c29-bead80a9a5fa]: error setting the 
networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-1-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: > pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: E0319 09:22:27.229212 7457 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-1-master-0_openshift-kube-controller-manager_43ca4232-9e9c-4b97-9c29-bead80a9a5fa_0(a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5): error adding pod openshift-kube-controller-manager_installer-1-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5" Netns:"/var/run/netns/77e126b6-96d1-4e7d-b76a-5042e6f021f0" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5;K8S_POD_UID=43ca4232-9e9c-4b97-9c29-bead80a9a5fa" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-1-master-0] networking: Multus: [openshift-kube-controller-manager/installer-1-master-0/43ca4232-9e9c-4b97-9c29-bead80a9a5fa]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-1-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: > pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:22:27.229418 master-0 kubenswrapper[7457]: E0319 09:22:27.229265 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-1-master-0_openshift-kube-controller-manager(43ca4232-9e9c-4b97-9c29-bead80a9a5fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-1-master-0_openshift-kube-controller-manager(43ca4232-9e9c-4b97-9c29-bead80a9a5fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_installer-1-master-0_openshift-kube-controller-manager_43ca4232-9e9c-4b97-9c29-bead80a9a5fa_0(a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5): error adding pod openshift-kube-controller-manager_installer-1-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5\\\" Netns:\\\"/var/run/netns/77e126b6-96d1-4e7d-b76a-5042e6f021f0\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=a94796e1c6f1609f876a988c98c2b0cf36f17f49daef64643652d275a28601a5;K8S_POD_UID=43ca4232-9e9c-4b97-9c29-bead80a9a5fa\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-1-master-0] networking: Multus: [openshift-kube-controller-manager/installer-1-master-0/43ca4232-9e9c-4b97-9c29-bead80a9a5fa]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-1-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: E0319 09:22:27.294589 7457 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-4-master-0_openshift-kube-scheduler_014ef8bd-b940-41e2-9239-c238afe6ebae_0(9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8): error adding pod openshift-kube-scheduler_installer-4-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8" Netns:"/var/run/netns/15a0295a-886c-437f-a9e1-5f55fff849fe" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-4-master-0;K8S_POD_INFRA_CONTAINER_ID=9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8;K8S_POD_UID=014ef8bd-b940-41e2-9239-c238afe6ebae" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-4-master-0] networking: Multus: [openshift-kube-scheduler/installer-4-master-0/014ef8bd-b940-41e2-9239-c238afe6ebae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-4-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod 
installer-4-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-master-0?timeout=1m0s": context deadline exceeded Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: > Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: E0319 09:22:27.294658 7457 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-4-master-0_openshift-kube-scheduler_014ef8bd-b940-41e2-9239-c238afe6ebae_0(9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8): error adding pod openshift-kube-scheduler_installer-4-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8" Netns:"/var/run/netns/15a0295a-886c-437f-a9e1-5f55fff849fe" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-4-master-0;K8S_POD_INFRA_CONTAINER_ID=9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8;K8S_POD_UID=014ef8bd-b940-41e2-9239-c238afe6ebae" Path:"" ERRORED: error configuring pod [openshift-kube-scheduler/installer-4-master-0] networking: Multus: [openshift-kube-scheduler/installer-4-master-0/014ef8bd-b940-41e2-9239-c238afe6ebae]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod installer-4-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-4-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-master-0?timeout=1m0s": context deadline exceeded Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: > pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: E0319 09:22:27.294680 7457 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-4-master-0_openshift-kube-scheduler_014ef8bd-b940-41e2-9239-c238afe6ebae_0(9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8): error adding pod openshift-kube-scheduler_installer-4-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8" Netns:"/var/run/netns/15a0295a-886c-437f-a9e1-5f55fff849fe" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-4-master-0;K8S_POD_INFRA_CONTAINER_ID=9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8;K8S_POD_UID=014ef8bd-b940-41e2-9239-c238afe6ebae" Path:"" ERRORED: error configuring pod 
[openshift-kube-scheduler/installer-4-master-0] networking: Multus: [openshift-kube-scheduler/installer-4-master-0/014ef8bd-b940-41e2-9239-c238afe6ebae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-4-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-4-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-master-0?timeout=1m0s": context deadline exceeded Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:27.294695 master-0 kubenswrapper[7457]: > pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:22:27.296155 master-0 kubenswrapper[7457]: E0319 09:22:27.294756 7457 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-4-master-0_openshift-kube-scheduler(014ef8bd-b940-41e2-9239-c238afe6ebae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-4-master-0_openshift-kube-scheduler(014ef8bd-b940-41e2-9239-c238afe6ebae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-4-master-0_openshift-kube-scheduler_014ef8bd-b940-41e2-9239-c238afe6ebae_0(9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8): error adding pod openshift-kube-scheduler_installer-4-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:\\\"9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8\\\" Netns:\\\"/var/run/netns/15a0295a-886c-437f-a9e1-5f55fff849fe\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-scheduler;K8S_POD_NAME=installer-4-master-0;K8S_POD_INFRA_CONTAINER_ID=9cee68768375e0f2851ffa4fd735bdb6a39ab28f33dec99e6dccba6908aea5f8;K8S_POD_UID=014ef8bd-b940-41e2-9239-c238afe6ebae\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-scheduler/installer-4-master-0] networking: Multus: [openshift-kube-scheduler/installer-4-master-0/014ef8bd-b940-41e2-9239-c238afe6ebae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-4-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-4-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-master-0?timeout=1m0s\\\": context deadline exceeded\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-scheduler/installer-4-master-0" podUID="014ef8bd-b940-41e2-9239-c238afe6ebae" Mar 19 09:22:27.429766 master-0 kubenswrapper[7457]: I0319 09:22:27.429731 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_434aabfa-50db-407e-92d3-a034696613e3/installer/0.log" Mar 19 09:22:27.430898 master-0 kubenswrapper[7457]: I0319 09:22:27.429828 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:22:27.435866 master-0 kubenswrapper[7457]: I0319 09:22:27.435835 7457 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="b5a024c432c7340543e69f0bd7bb2379e363c0a3445c80c57fd287fd74ddf6ae" exitCode=2 Mar 19 09:22:27.437642 master-0 kubenswrapper[7457]: I0319 09:22:27.437571 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-jg9m5_259794ab-d027-497a-b08e-5a6d79057668/catalog-operator/0.log" Mar 19 09:22:27.437821 master-0 kubenswrapper[7457]: I0319 09:22:27.437614 7457 generic.go:334] "Generic (PLEG): container finished" podID="259794ab-d027-497a-b08e-5a6d79057668" containerID="0bb6d4411c90b21c40d2ebf35d55a831d972d567e97bde63d3acc0f2997756c7" exitCode=1 Mar 19 09:22:27.437986 master-0 kubenswrapper[7457]: I0319 09:22:27.437920 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:22:27.438999 master-0 kubenswrapper[7457]: I0319 09:22:27.438749 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:22:27.439897 master-0 kubenswrapper[7457]: I0319 09:22:27.439548 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:22:27.439897 master-0 kubenswrapper[7457]: I0319 09:22:27.439765 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:22:28.443760 master-0 kubenswrapper[7457]: I0319 09:22:28.443702 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-rh692_7b29cb7b-26d2-4fab-9e03-2d7fdf937592/olm-operator/0.log" Mar 19 09:22:28.443760 master-0 kubenswrapper[7457]: I0319 09:22:28.443742 7457 generic.go:334] "Generic (PLEG): container finished" podID="7b29cb7b-26d2-4fab-9e03-2d7fdf937592" containerID="2aae1324ea9ac71e757c6b6742bbfe17bf26ff22a4f1597837954f981813c18e" exitCode=1 Mar 19 09:22:29.484276 master-0 kubenswrapper[7457]: I0319 09:22:29.484140 7457 patch_prober.go:28] interesting pod/catalog-operator-68f85b4d6c-jg9m5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 19 09:22:29.484276 master-0 kubenswrapper[7457]: I0319 09:22:29.484141 7457 patch_prober.go:28] interesting pod/catalog-operator-68f85b4d6c-jg9m5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 19 09:22:29.484276 master-0 kubenswrapper[7457]: I0319 09:22:29.484202 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" podUID="259794ab-d027-497a-b08e-5a6d79057668" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 19 09:22:29.484276 master-0 kubenswrapper[7457]: I0319 09:22:29.484204 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" 
podUID="259794ab-d027-497a-b08e-5a6d79057668" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 19 09:22:29.491149 master-0 kubenswrapper[7457]: I0319 09:22:29.490922 7457 patch_prober.go:28] interesting pod/olm-operator-5c9796789-rh692 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.24:8443/healthz\": dial tcp 10.128.0.24:8443: connect: connection refused" start-of-body= Mar 19 09:22:29.491149 master-0 kubenswrapper[7457]: I0319 09:22:29.490965 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" podUID="7b29cb7b-26d2-4fab-9e03-2d7fdf937592" containerName="olm-operator" probeResult="failure" output="Get \"https://10.128.0.24:8443/healthz\": dial tcp 10.128.0.24:8443: connect: connection refused" Mar 19 09:22:29.491149 master-0 kubenswrapper[7457]: I0319 09:22:29.491020 7457 patch_prober.go:28] interesting pod/olm-operator-5c9796789-rh692 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.24:8443/healthz\": dial tcp 10.128.0.24:8443: connect: connection refused" start-of-body= Mar 19 09:22:29.491149 master-0 kubenswrapper[7457]: I0319 09:22:29.491064 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" podUID="7b29cb7b-26d2-4fab-9e03-2d7fdf937592" containerName="olm-operator" probeResult="failure" output="Get \"https://10.128.0.24:8443/healthz\": dial tcp 10.128.0.24:8443: connect: connection refused" Mar 19 09:22:29.951753 master-0 kubenswrapper[7457]: E0319 09:22:29.951601 7457 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Mar 19 
09:22:29.951753 master-0 kubenswrapper[7457]: &Event{ObjectMeta:{controller-manager-6f89774b7d-nrm4r.189e33a3e8e9f690 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6f89774b7d-nrm4r,UID:c76b3023-dcc2-4ea3-b590-bf7fd718fc3f,APIVersion:v1,ResourceVersion:6373,FieldPath:spec.containers{controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.128.0.40:8443/healthz": dial tcp 10.128.0.40:8443: connect: connection refused Mar 19 09:22:29.951753 master-0 kubenswrapper[7457]: body: Mar 19 09:22:29.951753 master-0 kubenswrapper[7457]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:21:22.310887056 +0000 UTC m=+78.166226466,LastTimestamp:2026-03-19 09:21:22.310887056 +0000 UTC m=+78.166226466,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 19 09:22:29.951753 master-0 kubenswrapper[7457]: > Mar 19 09:22:31.458959 master-0 kubenswrapper[7457]: I0319 09:22:31.458635 7457 generic.go:334] "Generic (PLEG): container finished" podID="43cb2a3b-40e2-45ee-894a-6c833ee17efd" containerID="c4276c1e12973c262c98545548719e35835681298a10338c9d6009cc8f7eb867" exitCode=0 Mar 19 09:22:32.131366 master-0 kubenswrapper[7457]: E0319 09:22:32.131320 7457 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="5.783s" Mar 19 09:22:32.131576 master-0 kubenswrapper[7457]: I0319 09:22:32.131456 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" event={"ID":"a1098584-43b9-4f2c-83d2-22d95fb7b0c3","Type":"ContainerDied","Data":"dbe5b6ac78d411669d4c2885f202f3dc2681af9deb4ef2161f47be9747a76bd6"} Mar 19 09:22:32.133922 master-0 kubenswrapper[7457]: I0319 09:22:32.133887 7457 
scope.go:117] "RemoveContainer" containerID="dbe5b6ac78d411669d4c2885f202f3dc2681af9deb4ef2161f47be9747a76bd6" Mar 19 09:22:32.134002 master-0 kubenswrapper[7457]: I0319 09:22:32.133976 7457 scope.go:117] "RemoveContainer" containerID="16dabbfac23a88b18e7a1e5f639f318226358e768cd4e0f4bf6b8327e7b845c9" Mar 19 09:22:32.142408 master-0 kubenswrapper[7457]: I0319 09:22:32.142373 7457 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 09:22:32.148874 master-0 kubenswrapper[7457]: I0319 09:22:32.148836 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c"} Mar 19 09:22:32.148983 master-0 kubenswrapper[7457]: I0319 09:22:32.148895 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" event={"ID":"d664acc4-ec4f-4078-ae93-404a14ea18fc","Type":"ContainerDied","Data":"f068dc00867ec832963c43c66c2b3ba5e5c27207844ca25057536cc59dfa3810"} Mar 19 09:22:32.148983 master-0 kubenswrapper[7457]: I0319 09:22:32.148913 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" event={"ID":"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4","Type":"ContainerDied","Data":"33c416f3ddb853fb82ea998149e13a2a8f2bd563b1774b31ddf6b2c491ae3aa9"} Mar 19 09:22:32.148983 master-0 kubenswrapper[7457]: I0319 09:22:32.148928 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" event={"ID":"3b333a1e-2a7f-423a-8b40-99f30c89f740","Type":"ContainerDied","Data":"e7857b0cae9f1e592c846367f20964b7bdba92f2c028bce9260e23037d2618d9"} Mar 19 09:22:32.148983 master-0 kubenswrapper[7457]: I0319 09:22:32.148945 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492"} Mar 19 09:22:32.148983 master-0 kubenswrapper[7457]: I0319 09:22:32.148957 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6"} Mar 19 09:22:32.149139 master-0 kubenswrapper[7457]: I0319 09:22:32.149100 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23"} Mar 19 09:22:32.149176 master-0 kubenswrapper[7457]: I0319 09:22:32.149151 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb"} Mar 19 09:22:32.149176 master-0 kubenswrapper[7457]: I0319 09:22:32.149171 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51"} Mar 19 09:22:32.149233 master-0 kubenswrapper[7457]: I0319 09:22:32.149185 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-slmgx" event={"ID":"58ea8fcc-29b2-48ef-8629-2ba217c9d70c","Type":"ContainerDied","Data":"16dabbfac23a88b18e7a1e5f639f318226358e768cd4e0f4bf6b8327e7b845c9"} Mar 19 09:22:32.149233 master-0 kubenswrapper[7457]: I0319 09:22:32.149205 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" 
event={"ID":"e7fae040-28fa-4d97-8482-fd0dd12cc921","Type":"ContainerDied","Data":"7899eaeea83e799e75607f310011944713a832305f4796c7131bde2f6c40224c"} Mar 19 09:22:32.149233 master-0 kubenswrapper[7457]: I0319 09:22:32.149219 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" event={"ID":"f0c75102-6790-4ed3-84da-61c3611186f8","Type":"ContainerDied","Data":"46cd0596efe1a555d079c79fdb72a64ad03bb94cd6e0d19c502033e4b3f35b63"} Mar 19 09:22:32.149316 master-0 kubenswrapper[7457]: I0319 09:22:32.149233 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"434aabfa-50db-407e-92d3-a034696613e3","Type":"ContainerDied","Data":"9f695fbbb2ac33712845536e84cccd0ed476913549534361c504ad37ba881e39"} Mar 19 09:22:32.149316 master-0 kubenswrapper[7457]: I0319 09:22:32.149253 7457 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f695fbbb2ac33712845536e84cccd0ed476913549534361c504ad37ba881e39" Mar 19 09:22:32.149316 master-0 kubenswrapper[7457]: I0319 09:22:32.149269 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" event={"ID":"e7fae040-28fa-4d97-8482-fd0dd12cc921","Type":"ContainerStarted","Data":"f8630e30c090f2e5ba936a3f951bf3b77770a0dac127f9ddc4b2639a6297f442"} Mar 19 09:22:32.149316 master-0 kubenswrapper[7457]: I0319 09:22:32.149282 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"b5a024c432c7340543e69f0bd7bb2379e363c0a3445c80c57fd287fd74ddf6ae"} Mar 19 09:22:32.149316 master-0 kubenswrapper[7457]: I0319 09:22:32.149299 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a"} Mar 19 09:22:32.149316 master-0 kubenswrapper[7457]: I0319 09:22:32.149311 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" event={"ID":"259794ab-d027-497a-b08e-5a6d79057668","Type":"ContainerDied","Data":"0bb6d4411c90b21c40d2ebf35d55a831d972d567e97bde63d3acc0f2997756c7"} Mar 19 09:22:32.149316 master-0 kubenswrapper[7457]: I0319 09:22:32.149326 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" event={"ID":"7b29cb7b-26d2-4fab-9e03-2d7fdf937592","Type":"ContainerDied","Data":"2aae1324ea9ac71e757c6b6742bbfe17bf26ff22a4f1597837954f981813c18e"} Mar 19 09:22:32.149508 master-0 kubenswrapper[7457]: I0319 09:22:32.149340 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" event={"ID":"43cb2a3b-40e2-45ee-894a-6c833ee17efd","Type":"ContainerDied","Data":"c4276c1e12973c262c98545548719e35835681298a10338c9d6009cc8f7eb867"} Mar 19 09:22:32.149508 master-0 kubenswrapper[7457]: I0319 09:22:32.149365 7457 scope.go:117] "RemoveContainer" containerID="33c416f3ddb853fb82ea998149e13a2a8f2bd563b1774b31ddf6b2c491ae3aa9" Mar 19 09:22:32.149597 master-0 kubenswrapper[7457]: I0319 09:22:32.149552 7457 scope.go:117] "RemoveContainer" containerID="f068dc00867ec832963c43c66c2b3ba5e5c27207844ca25057536cc59dfa3810" Mar 19 09:22:32.149721 master-0 kubenswrapper[7457]: I0319 09:22:32.149640 7457 scope.go:117] "RemoveContainer" containerID="c4276c1e12973c262c98545548719e35835681298a10338c9d6009cc8f7eb867" Mar 19 09:22:32.149757 master-0 kubenswrapper[7457]: I0319 09:22:32.149702 7457 scope.go:117] "RemoveContainer" 
containerID="a74f0437d5a92c82edd9e58f193503c363594aaca67bff5a5ae6fcd1a5a28477" Mar 19 09:22:32.149798 master-0 kubenswrapper[7457]: I0319 09:22:32.149771 7457 scope.go:117] "RemoveContainer" containerID="e7857b0cae9f1e592c846367f20964b7bdba92f2c028bce9260e23037d2618d9" Mar 19 09:22:32.150322 master-0 kubenswrapper[7457]: I0319 09:22:32.149989 7457 scope.go:117] "RemoveContainer" containerID="46cd0596efe1a555d079c79fdb72a64ad03bb94cd6e0d19c502033e4b3f35b63" Mar 19 09:22:32.150322 master-0 kubenswrapper[7457]: I0319 09:22:32.150109 7457 scope.go:117] "RemoveContainer" containerID="0bb6d4411c90b21c40d2ebf35d55a831d972d567e97bde63d3acc0f2997756c7" Mar 19 09:22:32.150322 master-0 kubenswrapper[7457]: I0319 09:22:32.150270 7457 scope.go:117] "RemoveContainer" containerID="2aae1324ea9ac71e757c6b6742bbfe17bf26ff22a4f1597837954f981813c18e" Mar 19 09:22:32.228581 master-0 kubenswrapper[7457]: I0319 09:22:32.227997 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 09:22:32.228581 master-0 kubenswrapper[7457]: I0319 09:22:32.228080 7457 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="582a6a29-7bc2-44bc-aaa8-62338727b48b" Mar 19 09:22:32.233627 master-0 kubenswrapper[7457]: I0319 09:22:32.230324 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 09:22:32.252868 master-0 kubenswrapper[7457]: I0319 09:22:32.252323 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 09:22:32.267680 master-0 kubenswrapper[7457]: I0319 09:22:32.267358 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 09:22:32.267680 master-0 kubenswrapper[7457]: I0319 09:22:32.267388 7457 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" 
mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="582a6a29-7bc2-44bc-aaa8-62338727b48b" Mar 19 09:22:32.288208 master-0 kubenswrapper[7457]: I0319 09:22:32.288161 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:22:32.353166 master-0 kubenswrapper[7457]: I0319 09:22:32.353112 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn"] Mar 19 09:22:32.362609 master-0 kubenswrapper[7457]: I0319 09:22:32.362561 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597786f6d8-qsfjn"] Mar 19 09:22:32.435287 master-0 kubenswrapper[7457]: I0319 09:22:32.435234 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:22:32.452183 master-0 kubenswrapper[7457]: I0319 09:22:32.451793 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:22:32.469142 master-0 kubenswrapper[7457]: I0319 09:22:32.469114 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-rh692_7b29cb7b-26d2-4fab-9e03-2d7fdf937592/olm-operator/0.log" Mar 19 09:22:32.484233 master-0 kubenswrapper[7457]: I0319 09:22:32.469196 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" event={"ID":"7b29cb7b-26d2-4fab-9e03-2d7fdf937592","Type":"ContainerStarted","Data":"1fe0d8ce24995281a5bfc9ff130f8b2033f6051981e90e0690bbff8cbfd8438a"} Mar 19 09:22:32.484233 master-0 kubenswrapper[7457]: I0319 09:22:32.476849 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" 
event={"ID":"3b333a1e-2a7f-423a-8b40-99f30c89f740","Type":"ContainerStarted","Data":"50ade8898bd983b7011d7136db6a1b68fa8b890c20eca3736ed9402473428e6a"} Mar 19 09:22:32.484233 master-0 kubenswrapper[7457]: I0319 09:22:32.478661 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"43ca4232-9e9c-4b97-9c29-bead80a9a5fa","Type":"ContainerStarted","Data":"c82ea247d4a04551474a0bf79f03cd9f98a0925e6c68fa6c0c9d75dba8c1773c"} Mar 19 09:22:32.484233 master-0 kubenswrapper[7457]: I0319 09:22:32.482277 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/0.log" Mar 19 09:22:32.484233 master-0 kubenswrapper[7457]: I0319 09:22:32.482850 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-slmgx" event={"ID":"58ea8fcc-29b2-48ef-8629-2ba217c9d70c","Type":"ContainerStarted","Data":"0c9bb6f28236e5e26577492918d9d691fd6d1f78a2da9cc0727e44bdd383f7c9"} Mar 19 09:22:32.486117 master-0 kubenswrapper[7457]: I0319 09:22:32.486089 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" event={"ID":"43cb2a3b-40e2-45ee-894a-6c833ee17efd","Type":"ContainerStarted","Data":"25438cc3d636933145b12f3ec48bbc5e37772724b018c9bb6359db0510fbc73f"} Mar 19 09:22:32.487851 master-0 kubenswrapper[7457]: I0319 09:22:32.487821 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"014ef8bd-b940-41e2-9239-c238afe6ebae","Type":"ContainerStarted","Data":"419f90df85200464073bb55727a37114d61c84e4d555b334b5798b07351fb1d6"} Mar 19 09:22:32.492490 master-0 kubenswrapper[7457]: I0319 09:22:32.492457 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"] Mar 19 09:22:32.505094 master-0 kubenswrapper[7457]: I0319 09:22:32.503467 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f89774b7d-nrm4r"] Mar 19 09:22:32.516294 master-0 kubenswrapper[7457]: E0319 09:22:32.514372 7457 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:32.779758 master-0 kubenswrapper[7457]: I0319 09:22:32.779215 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:22:32.801543 master-0 kubenswrapper[7457]: I0319 09:22:32.801277 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:22:33.041645 master-0 kubenswrapper[7457]: I0319 09:22:33.041584 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.041564647 podStartE2EDuration="1.041564647s" podCreationTimestamp="2026-03-19 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:22:33.039940408 +0000 UTC m=+148.895279798" watchObservedRunningTime="2026-03-19 09:22:33.041564647 +0000 UTC m=+148.896904017" Mar 19 09:22:33.519397 master-0 kubenswrapper[7457]: I0319 09:22:33.519338 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" event={"ID":"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4","Type":"ContainerStarted","Data":"a43f09a94daeb0b801d3bf3632a624371e16c2cae0b3998aaa8458b263be91b6"} Mar 19 09:22:33.523183 master-0 kubenswrapper[7457]: I0319 09:22:33.523145 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" 
event={"ID":"43ca4232-9e9c-4b97-9c29-bead80a9a5fa","Type":"ContainerStarted","Data":"46c63e43dc61899ca4cb1732e5d7d4e693a722f5fb486db67fb30cfa5bfc8af5"} Mar 19 09:22:33.526049 master-0 kubenswrapper[7457]: I0319 09:22:33.525977 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" event={"ID":"d664acc4-ec4f-4078-ae93-404a14ea18fc","Type":"ContainerStarted","Data":"e6512654a7b81e81d3cf7f8b0a2c398a4043c013a96e8ba34bd6966d31c6e69d"} Mar 19 09:22:33.530600 master-0 kubenswrapper[7457]: I0319 09:22:33.530491 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" event={"ID":"a1098584-43b9-4f2c-83d2-22d95fb7b0c3","Type":"ContainerStarted","Data":"b384b23c103152484f8244a3fdd3b61a36a7a2a435a4c48dc2a74ee3dc591090"} Mar 19 09:22:33.535169 master-0 kubenswrapper[7457]: I0319 09:22:33.535091 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" event={"ID":"f0c75102-6790-4ed3-84da-61c3611186f8","Type":"ContainerStarted","Data":"c5855727be67ab73d998ddf3217b2c81b441e9bfbc39cb90c19d7f4d5714a6c9"} Mar 19 09:22:33.538063 master-0 kubenswrapper[7457]: I0319 09:22:33.538023 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"014ef8bd-b940-41e2-9239-c238afe6ebae","Type":"ContainerStarted","Data":"63e480bd33c67f5ddbdb4cc89c4a2b081a014d57d3304086d0ed39b6f8f0a797"} Mar 19 09:22:33.542110 master-0 kubenswrapper[7457]: I0319 09:22:33.542063 7457 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-jg9m5_259794ab-d027-497a-b08e-5a6d79057668/catalog-operator/0.log" Mar 19 09:22:33.542639 master-0 kubenswrapper[7457]: I0319 09:22:33.542604 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" event={"ID":"259794ab-d027-497a-b08e-5a6d79057668","Type":"ContainerStarted","Data":"b6a331dcdb25701cc66a593bb70deef879a6b0b867638ca05306fac55f081838"} Mar 19 09:22:33.543547 master-0 kubenswrapper[7457]: I0319 09:22:33.543476 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:22:33.543547 master-0 kubenswrapper[7457]: I0319 09:22:33.543513 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:22:33.547183 master-0 kubenswrapper[7457]: I0319 09:22:33.547146 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:22:33.554642 master-0 kubenswrapper[7457]: I0319 09:22:33.553972 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:22:33.888086 master-0 kubenswrapper[7457]: I0319 09:22:33.887604 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=77.887576898 podStartE2EDuration="1m17.887576898s" podCreationTimestamp="2026-03-19 09:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:22:33.885061457 +0000 UTC m=+149.740400857" watchObservedRunningTime="2026-03-19 09:22:33.887576898 +0000 UTC m=+149.742916298" Mar 19 09:22:33.946251 master-0 kubenswrapper[7457]: I0319 09:22:33.946171 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=73.946156852 podStartE2EDuration="1m13.946156852s" 
podCreationTimestamp="2026-03-19 09:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:22:33.942867074 +0000 UTC m=+149.798206454" watchObservedRunningTime="2026-03-19 09:22:33.946156852 +0000 UTC m=+149.801496222" Mar 19 09:22:34.346642 master-0 kubenswrapper[7457]: I0319 09:22:34.346577 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b63f3f-ca62-4195-80e9-7ff427e1c58b" path="/var/lib/kubelet/pods/06b63f3f-ca62-4195-80e9-7ff427e1c58b/volumes" Mar 19 09:22:34.348304 master-0 kubenswrapper[7457]: I0319 09:22:34.348269 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcfef1e-0652-4c52-a0a8-dfbf15408d03" path="/var/lib/kubelet/pods/7fcfef1e-0652-4c52-a0a8-dfbf15408d03/volumes" Mar 19 09:22:34.350021 master-0 kubenswrapper[7457]: I0319 09:22:34.349979 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" path="/var/lib/kubelet/pods/c3526fc7-b3af-4146-a791-fad627e8c9fa/volumes" Mar 19 09:22:34.351646 master-0 kubenswrapper[7457]: I0319 09:22:34.351602 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" path="/var/lib/kubelet/pods/c76b3023-dcc2-4ea3-b590-bf7fd718fc3f/volumes" Mar 19 09:22:34.815202 master-0 kubenswrapper[7457]: I0319 09:22:34.815156 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:22:34.975912 master-0 kubenswrapper[7457]: I0319 09:22:34.975854 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:34.976144 master-0 kubenswrapper[7457]: I0319 09:22:34.975945 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:35.009543 master-0 kubenswrapper[7457]: 
I0319 09:22:35.009484 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:35.568834 master-0 kubenswrapper[7457]: I0319 09:22:35.568621 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:37.816213 master-0 kubenswrapper[7457]: I0319 09:22:37.815889 7457 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:41.952108 master-0 kubenswrapper[7457]: I0319 09:22:41.952058 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:22:47.816141 master-0 kubenswrapper[7457]: I0319 09:22:47.815641 7457 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:48.021911 master-0 kubenswrapper[7457]: I0319 09:22:48.021837 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 09:22:48.022146 master-0 kubenswrapper[7457]: E0319 09:22:48.022070 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434aabfa-50db-407e-92d3-a034696613e3" containerName="installer" Mar 19 09:22:48.022146 master-0 kubenswrapper[7457]: I0319 09:22:48.022088 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="434aabfa-50db-407e-92d3-a034696613e3" 
containerName="installer" Mar 19 09:22:48.022146 master-0 kubenswrapper[7457]: E0319 09:22:48.022110 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" containerName="controller-manager" Mar 19 09:22:48.022146 master-0 kubenswrapper[7457]: I0319 09:22:48.022118 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" containerName="controller-manager" Mar 19 09:22:48.022146 master-0 kubenswrapper[7457]: E0319 09:22:48.022132 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerName="route-controller-manager" Mar 19 09:22:48.022146 master-0 kubenswrapper[7457]: I0319 09:22:48.022143 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerName="route-controller-manager" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: E0319 09:22:48.022156 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcfef1e-0652-4c52-a0a8-dfbf15408d03" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022167 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcfef1e-0652-4c52-a0a8-dfbf15408d03" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: E0319 09:22:48.022180 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022190 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: E0319 09:22:48.022202 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b63f3f-ca62-4195-80e9-7ff427e1c58b" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022211 
7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b63f3f-ca62-4195-80e9-7ff427e1c58b" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022504 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76b3023-dcc2-4ea3-b590-bf7fd718fc3f" containerName="controller-manager" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022542 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fcfef1e-0652-4c52-a0a8-dfbf15408d03" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022561 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022575 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3526fc7-b3af-4146-a791-fad627e8c9fa" containerName="route-controller-manager" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022589 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b63f3f-ca62-4195-80e9-7ff427e1c58b" containerName="installer" Mar 19 09:22:48.022716 master-0 kubenswrapper[7457]: I0319 09:22:48.022602 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="434aabfa-50db-407e-92d3-a034696613e3" containerName="installer" Mar 19 09:22:48.023461 master-0 kubenswrapper[7457]: I0319 09:22:48.022954 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.025376 master-0 kubenswrapper[7457]: I0319 09:22:48.025334 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-qzwhq"
Mar 19 09:22:48.026217 master-0 kubenswrapper[7457]: I0319 09:22:48.026182 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 19 09:22:48.035878 master-0 kubenswrapper[7457]: I0319 09:22:48.035823 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 19 09:22:48.127504 master-0 kubenswrapper[7457]: I0319 09:22:48.127335 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.127504 master-0 kubenswrapper[7457]: I0319 09:22:48.127457 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.127819 master-0 kubenswrapper[7457]: I0319 09:22:48.127675 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.229346 master-0 kubenswrapper[7457]: I0319 09:22:48.229292 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.229840 master-0 kubenswrapper[7457]: I0319 09:22:48.229800 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.230071 master-0 kubenswrapper[7457]: I0319 09:22:48.230041 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.230295 master-0 kubenswrapper[7457]: I0319 09:22:48.229458 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.230569 master-0 kubenswrapper[7457]: I0319 09:22:48.230163 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.266610 master-0 kubenswrapper[7457]: I0319 09:22:48.266475 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.348866 master-0 kubenswrapper[7457]: I0319 09:22:48.348785 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:22:48.563157 master-0 kubenswrapper[7457]: I0319 09:22:48.563067 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 19 09:22:48.569455 master-0 kubenswrapper[7457]: W0319 09:22:48.569394 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podff98fb1e_7a1f_4657_b085_743d6f2d28e2.slice/crio-f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e WatchSource:0}: Error finding container f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e: Status 404 returned error can't find the container with id f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e
Mar 19 09:22:48.619105 master-0 kubenswrapper[7457]: I0319 09:22:48.618760 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"ff98fb1e-7a1f-4657-b085-743d6f2d28e2","Type":"ContainerStarted","Data":"f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e"}
Mar 19 09:22:49.334052 master-0 kubenswrapper[7457]: I0319 09:22:49.333963 7457 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-5bddk container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.11:8443/healthz\": net/http: TLS handshake timeout" start-of-body=
Mar 19 09:22:49.335125 master-0 kubenswrapper[7457]: I0319 09:22:49.334056 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" podUID="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.11:8443/healthz\": net/http: TLS handshake timeout"
Mar 19 09:22:49.627152 master-0 kubenswrapper[7457]: I0319 09:22:49.626990 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"ff98fb1e-7a1f-4657-b085-743d6f2d28e2","Type":"ContainerStarted","Data":"f2f4573ac6359250badfecb43e628f11a57ba451127ad683fe2723ca4c3b389c"}
Mar 19 09:22:49.905663 master-0 kubenswrapper[7457]: I0319 09:22:49.905436 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=1.905382238 podStartE2EDuration="1.905382238s" podCreationTimestamp="2026-03-19 09:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:22:49.903223997 +0000 UTC m=+165.758563377" watchObservedRunningTime="2026-03-19 09:22:49.905382238 +0000 UTC m=+165.760721638"
Mar 19 09:22:56.313714 master-0 kubenswrapper[7457]: I0319 09:22:56.313647 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"]
Mar 19 09:22:56.314655 master-0 kubenswrapper[7457]: I0319 09:22:56.314512 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.317012 master-0 kubenswrapper[7457]: I0319 09:22:56.316976 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"]
Mar 19 09:22:56.317278 master-0 kubenswrapper[7457]: I0319 09:22:56.317257 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:22:56.317528 master-0 kubenswrapper[7457]: I0319 09:22:56.317467 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.318356 master-0 kubenswrapper[7457]: I0319 09:22:56.318309 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:22:56.318793 master-0 kubenswrapper[7457]: I0319 09:22:56.318764 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:22:56.319372 master-0 kubenswrapper[7457]: I0319 09:22:56.319348 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-224sj"
Mar 19 09:22:56.319430 master-0 kubenswrapper[7457]: I0319 09:22:56.319380 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:22:56.320365 master-0 kubenswrapper[7457]: I0319 09:22:56.320341 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:22:56.320434 master-0 kubenswrapper[7457]: I0319 09:22:56.320365 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-wfcn9"
Mar 19 09:22:56.320434 master-0 kubenswrapper[7457]: I0319 09:22:56.320384 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:22:56.321041 master-0 kubenswrapper[7457]: I0319 09:22:56.321025 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:22:56.321133 master-0 kubenswrapper[7457]: I0319 09:22:56.321098 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:22:56.321174 master-0 kubenswrapper[7457]: I0319 09:22:56.321058 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:22:56.321245 master-0 kubenswrapper[7457]: I0319 09:22:56.321066 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:22:56.331779 master-0 kubenswrapper[7457]: I0319 09:22:56.331737 7457 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:22:56.358798 master-0 kubenswrapper[7457]: I0319 09:22:56.358739 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"]
Mar 19 09:22:56.360992 master-0 kubenswrapper[7457]: I0319 09:22:56.360958 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"]
Mar 19 09:22:56.415815 master-0 kubenswrapper[7457]: I0319 09:22:56.415760 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.415815 master-0 kubenswrapper[7457]: I0319 09:22:56.415812 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.415815 master-0 kubenswrapper[7457]: I0319 09:22:56.415833 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.416062 master-0 kubenswrapper[7457]: I0319 09:22:56.415897 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.416062 master-0 kubenswrapper[7457]: I0319 09:22:56.415944 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.416062 master-0 kubenswrapper[7457]: I0319 09:22:56.415964 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49fpz\" (UniqueName: \"kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.416062 master-0 kubenswrapper[7457]: I0319 09:22:56.415990 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.416062 master-0 kubenswrapper[7457]: I0319 09:22:56.416022 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.416062 master-0 kubenswrapper[7457]: I0319 09:22:56.416039 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sclqq\" (UniqueName: \"kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.517167 master-0 kubenswrapper[7457]: I0319 09:22:56.517037 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sclqq\" (UniqueName: \"kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.517167 master-0 kubenswrapper[7457]: I0319 09:22:56.517161 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.517606 master-0 kubenswrapper[7457]: I0319 09:22:56.517429 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.517730 master-0 kubenswrapper[7457]: I0319 09:22:56.517631 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.517730 master-0 kubenswrapper[7457]: I0319 09:22:56.517713 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.517951 master-0 kubenswrapper[7457]: I0319 09:22:56.517771 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.517951 master-0 kubenswrapper[7457]: I0319 09:22:56.517827 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49fpz\" (UniqueName: \"kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.517951 master-0 kubenswrapper[7457]: I0319 09:22:56.517876 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.517951 master-0 kubenswrapper[7457]: I0319 09:22:56.517919 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.519951 master-0 kubenswrapper[7457]: I0319 09:22:56.519867 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.520104 master-0 kubenswrapper[7457]: I0319 09:22:56.520032 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.520533 master-0 kubenswrapper[7457]: I0319 09:22:56.520471 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.520801 master-0 kubenswrapper[7457]: I0319 09:22:56.520733 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.521205 master-0 kubenswrapper[7457]: I0319 09:22:56.521147 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.523107 master-0 kubenswrapper[7457]: I0319 09:22:56.523035 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.523721 master-0 kubenswrapper[7457]: I0319 09:22:56.523671 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.799512 master-0 kubenswrapper[7457]: I0319 09:22:56.794300 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fpz\" (UniqueName: \"kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.803001 master-0 kubenswrapper[7457]: I0319 09:22:56.802947 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sclqq\" (UniqueName: \"kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:56.940864 master-0 kubenswrapper[7457]: I0319 09:22:56.940788 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:56.959630 master-0 kubenswrapper[7457]: I0319 09:22:56.959270 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:57.439632 master-0 kubenswrapper[7457]: I0319 09:22:57.439519 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"]
Mar 19 09:22:57.450703 master-0 kubenswrapper[7457]: W0319 09:22:57.450614 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d5e311c_1c6a_4d5d_8c2b_493025593934.slice/crio-3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda WatchSource:0}: Error finding container 3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda: Status 404 returned error can't find the container with id 3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda
Mar 19 09:22:57.534307 master-0 kubenswrapper[7457]: I0319 09:22:57.534221 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"]
Mar 19 09:22:57.541889 master-0 kubenswrapper[7457]: W0319 09:22:57.541846 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d66357_fcee_4e70_b563_5895b978ab55.slice/crio-6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54 WatchSource:0}: Error finding container 6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54: Status 404 returned error can't find the container with id 6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54
Mar 19 09:22:57.667831 master-0 kubenswrapper[7457]: I0319 09:22:57.667768 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" event={"ID":"67d66357-fcee-4e70-b563-5895b978ab55","Type":"ContainerStarted","Data":"6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54"}
Mar 19 09:22:57.668523 master-0 kubenswrapper[7457]: I0319 09:22:57.668480 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" event={"ID":"1d5e311c-1c6a-4d5d-8c2b-493025593934","Type":"ContainerStarted","Data":"3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda"}
Mar 19 09:22:57.815954 master-0 kubenswrapper[7457]: I0319 09:22:57.815364 7457 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:22:57.815954 master-0 kubenswrapper[7457]: I0319 09:22:57.815586 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:22:57.816438 master-0 kubenswrapper[7457]: I0319 09:22:57.816383 7457 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 19 09:22:57.816755 master-0 kubenswrapper[7457]: I0319 09:22:57.816708 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a" gracePeriod=30
Mar 19 09:22:58.168678 master-0 kubenswrapper[7457]: I0319 09:22:58.168606 7457 patch_prober.go:28] interesting pod/authentication-operator-5885bfd7f4-k4dfd container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:22:58.168872 master-0 kubenswrapper[7457]: I0319 09:22:58.168707 7457 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" podUID="e7fae040-28fa-4d97-8482-fd0dd12cc921" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:22:58.675894 master-0 kubenswrapper[7457]: I0319 09:22:58.675783 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" event={"ID":"67d66357-fcee-4e70-b563-5895b978ab55","Type":"ContainerStarted","Data":"6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194"}
Mar 19 09:22:58.676664 master-0 kubenswrapper[7457]: I0319 09:22:58.676103 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:22:58.677997 master-0 kubenswrapper[7457]: I0319 09:22:58.677821 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" event={"ID":"1d5e311c-1c6a-4d5d-8c2b-493025593934","Type":"ContainerStarted","Data":"53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378"}
Mar 19 09:22:58.678073 master-0 kubenswrapper[7457]: I0319 09:22:58.678013 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:58.682154 master-0 kubenswrapper[7457]: I0319 09:22:58.682121 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:22:58.682288 master-0 kubenswrapper[7457]: I0319 09:22:58.682239 7457 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a" exitCode=2
Mar 19 09:22:58.682340 master-0 kubenswrapper[7457]: I0319 09:22:58.682313 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a"}
Mar 19 09:22:58.682377 master-0 kubenswrapper[7457]: I0319 09:22:58.682362 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb"}
Mar 19 09:22:58.682406 master-0 kubenswrapper[7457]: I0319 09:22:58.682391 7457 scope.go:117] "RemoveContainer" containerID="b5a024c432c7340543e69f0bd7bb2379e363c0a3445c80c57fd287fd74ddf6ae"
Mar 19 09:22:58.806408 master-0 kubenswrapper[7457]: I0319 09:22:58.806287 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" podStartSLOduration=100.806258228 podStartE2EDuration="1m40.806258228s" podCreationTimestamp="2026-03-19 09:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:22:58.802963661 +0000 UTC m=+174.658303071" watchObservedRunningTime="2026-03-19 09:22:58.806258228 +0000 UTC m=+174.661597638"
Mar 19 09:22:58.880368 master-0 kubenswrapper[7457]: I0319 09:22:58.880291 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" podStartSLOduration=100.880272786 podStartE2EDuration="1m40.880272786s" podCreationTimestamp="2026-03-19 09:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:22:58.878379641 +0000 UTC m=+174.733719021" watchObservedRunningTime="2026-03-19 09:22:58.880272786 +0000 UTC m=+174.735612166"
Mar 19 09:22:59.676925 master-0 kubenswrapper[7457]: I0319 09:22:59.676508 7457 patch_prober.go:28] interesting pod/route-controller-manager-8555fbf585-9ggfr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:22:59.678074 master-0 kubenswrapper[7457]: I0319 09:22:59.676931 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" podUID="67d66357-fcee-4e70-b563-5895b978ab55" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:23:00.690210 master-0 kubenswrapper[7457]: I0319 09:23:00.690157 7457 patch_prober.go:28] interesting pod/route-controller-manager-8555fbf585-9ggfr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:23:00.690785 master-0 kubenswrapper[7457]: I0319 09:23:00.690223 7457 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" podUID="67d66357-fcee-4e70-b563-5895b978ab55" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:23:01.953006 master-0 kubenswrapper[7457]: I0319 09:23:01.952932 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:23:04.453206 master-0 kubenswrapper[7457]: E0319 09:23:04.453102 7457 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/a69ec2a9d01632a7064b6c9ca3f81260849b84e96ec44da919cb17405e7b22dc/diff" to get inode usage: stat /var/lib/containers/storage/overlay/a69ec2a9d01632a7064b6c9ca3f81260849b84e96ec44da919cb17405e7b22dc/diff: no such file or directory, extraDiskErr:
Mar 19 09:23:04.765560 master-0 kubenswrapper[7457]: I0319 09:23:04.765504 7457 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 09:23:04.765775 master-0 kubenswrapper[7457]: I0319 09:23:04.765743 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" containerID="cri-o://a80a075ae2d2bfe0e545df390d9ff0ad18516cad1ed3ad4a716e570d8e5f21c1" gracePeriod=30
Mar 19 09:23:04.766866 master-0 kubenswrapper[7457]: I0319 09:23:04.766825 7457 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:23:04.768012 master-0 kubenswrapper[7457]: E0319 09:23:04.767039 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:23:04.768012 master-0 kubenswrapper[7457]: I0319 09:23:04.767053 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:23:04.768012 master-0 kubenswrapper[7457]: E0319 09:23:04.767071 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:23:04.768012 master-0 kubenswrapper[7457]: I0319 09:23:04.767079 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:23:04.768012 master-0 kubenswrapper[7457]: I0319 09:23:04.767154 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:23:04.768012 master-0 kubenswrapper[7457]: I0319 09:23:04.767165 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:23:04.768012 master-0 kubenswrapper[7457]: I0319 09:23:04.768014 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:04.818288 master-0 kubenswrapper[7457]: I0319 09:23:04.817978 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:23:04.855665 master-0 kubenswrapper[7457]: I0319 09:23:04.853765 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:23:04.857336 master-0 kubenswrapper[7457]: I0319 09:23:04.855997 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:23:04.936128 master-0 kubenswrapper[7457]: I0319 09:23:04.936071 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:04.936128 master-0 kubenswrapper[7457]: I0319 09:23:04.936122 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:04.960170 master-0 kubenswrapper[7457]: I0319 09:23:04.960140 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:23:04.991908 master-0 kubenswrapper[7457]: I0319 09:23:04.991850 7457 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="18acdf99-5890-4e3d-b40f-a9633a69b4e6"
Mar 19 09:23:05.037472 master-0 kubenswrapper[7457]: I0319 09:23:05.037355 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:05.037472 master-0 kubenswrapper[7457]: I0319 09:23:05.037397 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:05.037472 master-0 kubenswrapper[7457]: I0319 09:23:05.037457 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:05.037472 master-0 kubenswrapper[7457]: I0319 09:23:05.037461 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:05.138118 master-0
kubenswrapper[7457]: I0319 09:23:05.138059 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " Mar 19 09:23:05.138118 master-0 kubenswrapper[7457]: I0319 09:23:05.138124 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " Mar 19 09:23:05.138318 master-0 kubenswrapper[7457]: I0319 09:23:05.138218 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs" (OuterVolumeSpecName: "logs") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:05.138318 master-0 kubenswrapper[7457]: I0319 09:23:05.138305 7457 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:05.138381 master-0 kubenswrapper[7457]: I0319 09:23:05.138316 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets" (OuterVolumeSpecName: "secrets") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:05.148148 master-0 kubenswrapper[7457]: I0319 09:23:05.148104 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:23:05.239454 master-0 kubenswrapper[7457]: I0319 09:23:05.239399 7457 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.354484 7457 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355335 7457 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: E0319 09:23:05.355485 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355494 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: E0319 09:23:05.355503 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355509 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: E0319 09:23:05.355545 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355551 7457 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: E0319 09:23:05.355558 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355563 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: E0319 09:23:05.355575 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355583 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355656 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355664 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355673 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355680 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355687 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" 
containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: E0319 09:23:05.355761 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355768 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.356606 master-0 kubenswrapper[7457]: I0319 09:23:05.355828 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:23:05.358668 master-0 kubenswrapper[7457]: I0319 09:23:05.357664 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:05.398843 master-0 kubenswrapper[7457]: I0319 09:23:05.398057 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:23:05.543507 master-0 kubenswrapper[7457]: I0319 09:23:05.543368 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:05.543507 master-0 kubenswrapper[7457]: I0319 09:23:05.543448 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 
09:23:05.644195 master-0 kubenswrapper[7457]: I0319 09:23:05.644127 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:05.644195 master-0 kubenswrapper[7457]: I0319 09:23:05.644203 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:05.644444 master-0 kubenswrapper[7457]: I0319 09:23:05.644283 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:05.644444 master-0 kubenswrapper[7457]: I0319 09:23:05.644324 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:05.698497 master-0 kubenswrapper[7457]: I0319 09:23:05.698380 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:05.711389 master-0 kubenswrapper[7457]: W0319 09:23:05.711251 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda474cbd3d0d9d7ed4d0ff461a5e5fe1a.slice/crio-1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2 WatchSource:0}: Error finding container 1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2: Status 404 returned error can't find the container with id 1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2 Mar 19 09:23:05.724165 master-0 kubenswrapper[7457]: I0319 09:23:05.724092 7457 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="6b51526a63cb4fc4843a03fc75fd50c63454c0795793d3149e658718010b95b1" exitCode=0 Mar 19 09:23:05.724348 master-0 kubenswrapper[7457]: I0319 09:23:05.724280 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"6b51526a63cb4fc4843a03fc75fd50c63454c0795793d3149e658718010b95b1"} Mar 19 09:23:05.724348 master-0 kubenswrapper[7457]: I0319 09:23:05.724334 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"d6cda39585354e47346ec04d7e9023161d8c669dfe02492069483d076fdb9801"} Mar 19 09:23:05.725293 master-0 kubenswrapper[7457]: I0319 09:23:05.725243 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2"} Mar 19 09:23:05.727130 master-0 kubenswrapper[7457]: 
I0319 09:23:05.726854 7457 generic.go:334] "Generic (PLEG): container finished" podID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerID="46c63e43dc61899ca4cb1732e5d7d4e693a722f5fb486db67fb30cfa5bfc8af5" exitCode=0 Mar 19 09:23:05.727130 master-0 kubenswrapper[7457]: I0319 09:23:05.726956 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"43ca4232-9e9c-4b97-9c29-bead80a9a5fa","Type":"ContainerDied","Data":"46c63e43dc61899ca4cb1732e5d7d4e693a722f5fb486db67fb30cfa5bfc8af5"} Mar 19 09:23:05.729002 master-0 kubenswrapper[7457]: I0319 09:23:05.728932 7457 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="a80a075ae2d2bfe0e545df390d9ff0ad18516cad1ed3ad4a716e570d8e5f21c1" exitCode=0 Mar 19 09:23:05.729098 master-0 kubenswrapper[7457]: I0319 09:23:05.728995 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:23:05.729098 master-0 kubenswrapper[7457]: I0319 09:23:05.729080 7457 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979" Mar 19 09:23:05.729263 master-0 kubenswrapper[7457]: I0319 09:23:05.729116 7457 scope.go:117] "RemoveContainer" containerID="9a59b0cbe8ea8fa4b17a290e74267cd3c1f43f118142de7e624d510bbb389da7" Mar 19 09:23:05.731978 master-0 kubenswrapper[7457]: I0319 09:23:05.731939 7457 generic.go:334] "Generic (PLEG): container finished" podID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerID="63e480bd33c67f5ddbdb4cc89c4a2b081a014d57d3304086d0ed39b6f8f0a797" exitCode=0 Mar 19 09:23:05.732112 master-0 kubenswrapper[7457]: I0319 09:23:05.732074 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" 
event={"ID":"014ef8bd-b940-41e2-9239-c238afe6ebae","Type":"ContainerDied","Data":"63e480bd33c67f5ddbdb4cc89c4a2b081a014d57d3304086d0ed39b6f8f0a797"} Mar 19 09:23:06.341982 master-0 kubenswrapper[7457]: I0319 09:23:06.341612 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83737980b9ee109184b1d78e942cf36" path="/var/lib/kubelet/pods/c83737980b9ee109184b1d78e942cf36/volumes" Mar 19 09:23:06.342527 master-0 kubenswrapper[7457]: I0319 09:23:06.342501 7457 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 19 09:23:06.564839 master-0 kubenswrapper[7457]: I0319 09:23:06.564790 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 09:23:06.564839 master-0 kubenswrapper[7457]: I0319 09:23:06.564827 7457 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="18acdf99-5890-4e3d-b40f-a9633a69b4e6" Mar 19 09:23:06.573904 master-0 kubenswrapper[7457]: I0319 09:23:06.572474 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 09:23:06.573904 master-0 kubenswrapper[7457]: I0319 09:23:06.572517 7457 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="18acdf99-5890-4e3d-b40f-a9633a69b4e6" Mar 19 09:23:06.741798 master-0 kubenswrapper[7457]: I0319 09:23:06.741085 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"793cfb93f2346e0ad23e32cbd1e114aae92c03db2ff0726f899f8a1c39d66416"} Mar 19 09:23:06.741798 master-0 kubenswrapper[7457]: I0319 09:23:06.741129 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"6d8e777ee2c690477b890e212d15377f6f78a023a47f6d1ccdb66d4fd4236c20"} Mar 19 09:23:06.741798 master-0 kubenswrapper[7457]: I0319 09:23:06.741144 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"57919871ecdce20adcf14d4b3e782688c40e27d380e27e5683da1cfdca89a184"} Mar 19 09:23:06.741798 master-0 kubenswrapper[7457]: I0319 09:23:06.741762 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:23:06.744602 master-0 kubenswrapper[7457]: I0319 09:23:06.744509 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"d2773f59c5e5fc7c4c20d27964b8855d429ffb69ddd44594d1e039aab3c6d9c7"} Mar 19 09:23:06.744602 master-0 kubenswrapper[7457]: I0319 09:23:06.744556 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"1cd8ba1cf946b8e03e8d14ad1a9ca15bc751df12a73a64e9d4a3982985753d17"} Mar 19 09:23:06.744602 master-0 kubenswrapper[7457]: I0319 09:23:06.744569 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"9889603cf425a1afe622f697ec4d233d82f7e355b75cc078b65e38e02fed7bd5"} Mar 19 09:23:06.752588 master-0 kubenswrapper[7457]: I0319 09:23:06.752503 7457 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" containerID="cri-o://b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5" gracePeriod=30 Mar 19 09:23:06.752931 master-0 kubenswrapper[7457]: I0319 09:23:06.752796 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb" gracePeriod=30 Mar 19 09:23:06.769371 master-0 kubenswrapper[7457]: I0319 09:23:06.768972 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.768953428 podStartE2EDuration="2.768953428s" podCreationTimestamp="2026-03-19 09:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:06.765626141 +0000 UTC m=+182.620965541" watchObservedRunningTime="2026-03-19 09:23:06.768953428 +0000 UTC m=+182.624292798" Mar 19 09:23:06.912375 master-0 kubenswrapper[7457]: I0319 09:23:06.911470 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:23:06.963657 master-0 kubenswrapper[7457]: I0319 09:23:06.963567 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:07.001369 master-0 kubenswrapper[7457]: I0319 09:23:07.001328 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:23:07.004646 master-0 kubenswrapper[7457]: I0319 09:23:07.004618 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:23:07.059776 master-0 kubenswrapper[7457]: I0319 09:23:07.059728 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:23:07.059776 master-0 kubenswrapper[7457]: I0319 09:23:07.059775 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:23:07.060001 master-0 kubenswrapper[7457]: I0319 09:23:07.059835 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:23:07.060001 master-0 kubenswrapper[7457]: I0319 09:23:07.059856 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:23:07.060001 master-0 kubenswrapper[7457]: I0319 09:23:07.059895 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: 
\"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:23:07.060001 master-0 kubenswrapper[7457]: I0319 09:23:07.059890 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs" (OuterVolumeSpecName: "logs") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.060001 master-0 kubenswrapper[7457]: I0319 09:23:07.059940 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.060001 master-0 kubenswrapper[7457]: I0319 09:23:07.059960 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets" (OuterVolumeSpecName: "secrets") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.060001 master-0 kubenswrapper[7457]: I0319 09:23:07.059997 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.060185 master-0 kubenswrapper[7457]: I0319 09:23:07.060039 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config" (OuterVolumeSpecName: "config") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.060185 master-0 kubenswrapper[7457]: I0319 09:23:07.060157 7457 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.060185 master-0 kubenswrapper[7457]: I0319 09:23:07.060168 7457 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.060185 master-0 kubenswrapper[7457]: I0319 09:23:07.060177 7457 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.060185 master-0 kubenswrapper[7457]: I0319 09:23:07.060186 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.060316 master-0 kubenswrapper[7457]: I0319 09:23:07.060194 7457 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.161097 master-0 kubenswrapper[7457]: I0319 09:23:07.161035 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/014ef8bd-b940-41e2-9239-c238afe6ebae-kube-api-access\") pod \"014ef8bd-b940-41e2-9239-c238afe6ebae\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " Mar 19 09:23:07.161097 master-0 kubenswrapper[7457]: I0319 09:23:07.161101 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kubelet-dir\") pod \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " Mar 19 09:23:07.161390 master-0 kubenswrapper[7457]: I0319 09:23:07.161141 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kube-api-access\") pod \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " Mar 19 09:23:07.161390 master-0 kubenswrapper[7457]: I0319 09:23:07.161169 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-kubelet-dir\") pod \"014ef8bd-b940-41e2-9239-c238afe6ebae\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " Mar 19 09:23:07.161390 master-0 kubenswrapper[7457]: I0319 09:23:07.161224 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "014ef8bd-b940-41e2-9239-c238afe6ebae" (UID: "014ef8bd-b940-41e2-9239-c238afe6ebae"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.161390 master-0 kubenswrapper[7457]: I0319 09:23:07.161250 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "43ca4232-9e9c-4b97-9c29-bead80a9a5fa" (UID: "43ca4232-9e9c-4b97-9c29-bead80a9a5fa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.161390 master-0 kubenswrapper[7457]: I0319 09:23:07.161334 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-var-lock\") pod \"014ef8bd-b940-41e2-9239-c238afe6ebae\" (UID: \"014ef8bd-b940-41e2-9239-c238afe6ebae\") " Mar 19 09:23:07.161390 master-0 kubenswrapper[7457]: I0319 09:23:07.161364 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-var-lock\") pod \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\" (UID: \"43ca4232-9e9c-4b97-9c29-bead80a9a5fa\") " Mar 19 09:23:07.161762 master-0 kubenswrapper[7457]: I0319 09:23:07.161702 7457 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.161762 master-0 kubenswrapper[7457]: I0319 09:23:07.161718 7457 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.161762 master-0 kubenswrapper[7457]: I0319 09:23:07.161729 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-var-lock" 
(OuterVolumeSpecName: "var-lock") pod "014ef8bd-b940-41e2-9239-c238afe6ebae" (UID: "014ef8bd-b940-41e2-9239-c238afe6ebae"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.161762 master-0 kubenswrapper[7457]: I0319 09:23:07.161742 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-var-lock" (OuterVolumeSpecName: "var-lock") pod "43ca4232-9e9c-4b97-9c29-bead80a9a5fa" (UID: "43ca4232-9e9c-4b97-9c29-bead80a9a5fa"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:07.166932 master-0 kubenswrapper[7457]: I0319 09:23:07.166874 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "43ca4232-9e9c-4b97-9c29-bead80a9a5fa" (UID: "43ca4232-9e9c-4b97-9c29-bead80a9a5fa"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:23:07.168127 master-0 kubenswrapper[7457]: I0319 09:23:07.167752 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014ef8bd-b940-41e2-9239-c238afe6ebae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "014ef8bd-b940-41e2-9239-c238afe6ebae" (UID: "014ef8bd-b940-41e2-9239-c238afe6ebae"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:23:07.263047 master-0 kubenswrapper[7457]: I0319 09:23:07.262878 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/014ef8bd-b940-41e2-9239-c238afe6ebae-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.263047 master-0 kubenswrapper[7457]: I0319 09:23:07.262931 7457 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.263047 master-0 kubenswrapper[7457]: I0319 09:23:07.262943 7457 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/014ef8bd-b940-41e2-9239-c238afe6ebae-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.263047 master-0 kubenswrapper[7457]: I0319 09:23:07.262952 7457 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/43ca4232-9e9c-4b97-9c29-bead80a9a5fa-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:07.761787 master-0 kubenswrapper[7457]: I0319 09:23:07.761718 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"e7ac13cba0a41afefd1f1913bc7aba4a187c6d99752100ec1e36b10b44ac9c6a"} Mar 19 09:23:07.763790 master-0 kubenswrapper[7457]: I0319 09:23:07.763715 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"43ca4232-9e9c-4b97-9c29-bead80a9a5fa","Type":"ContainerDied","Data":"c82ea247d4a04551474a0bf79f03cd9f98a0925e6c68fa6c0c9d75dba8c1773c"} Mar 19 09:23:07.763837 master-0 kubenswrapper[7457]: I0319 09:23:07.763798 7457 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="c82ea247d4a04551474a0bf79f03cd9f98a0925e6c68fa6c0c9d75dba8c1773c" Mar 19 09:23:07.763837 master-0 kubenswrapper[7457]: I0319 09:23:07.763756 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:23:07.765685 master-0 kubenswrapper[7457]: I0319 09:23:07.765650 7457 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:23:07.766124 master-0 kubenswrapper[7457]: I0319 09:23:07.766083 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"014ef8bd-b940-41e2-9239-c238afe6ebae","Type":"ContainerDied","Data":"419f90df85200464073bb55727a37114d61c84e4d555b334b5798b07351fb1d6"} Mar 19 09:23:07.766171 master-0 kubenswrapper[7457]: I0319 09:23:07.766134 7457 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419f90df85200464073bb55727a37114d61c84e4d555b334b5798b07351fb1d6" Mar 19 09:23:07.771561 master-0 kubenswrapper[7457]: I0319 09:23:07.768563 7457 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb" exitCode=0 Mar 19 09:23:07.771561 master-0 kubenswrapper[7457]: I0319 09:23:07.768589 7457 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5" exitCode=0 Mar 19 09:23:07.771561 master-0 kubenswrapper[7457]: I0319 09:23:07.769207 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:23:07.775943 master-0 kubenswrapper[7457]: I0319 09:23:07.771888 7457 scope.go:117] "RemoveContainer" containerID="069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb" Mar 19 09:23:07.787728 master-0 kubenswrapper[7457]: I0319 09:23:07.787581 7457 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.787558665 podStartE2EDuration="2.787558665s" podCreationTimestamp="2026-03-19 09:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:07.783841739 +0000 UTC m=+183.639181119" watchObservedRunningTime="2026-03-19 09:23:07.787558665 +0000 UTC m=+183.642898055" Mar 19 09:23:07.795628 master-0 kubenswrapper[7457]: I0319 09:23:07.795604 7457 scope.go:117] "RemoveContainer" containerID="27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a" Mar 19 09:23:07.823193 master-0 kubenswrapper[7457]: I0319 09:23:07.823159 7457 scope.go:117] "RemoveContainer" containerID="b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5" Mar 19 09:23:07.849348 master-0 kubenswrapper[7457]: I0319 09:23:07.849309 7457 scope.go:117] "RemoveContainer" containerID="069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb" Mar 19 09:23:07.849812 master-0 kubenswrapper[7457]: E0319 09:23:07.849763 7457 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb\": container with ID starting with 069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb not found: ID does not exist" containerID="069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb" Mar 19 09:23:07.849877 master-0 kubenswrapper[7457]: I0319 
09:23:07.849818 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb"} err="failed to get container status \"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb\": rpc error: code = NotFound desc = could not find container \"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb\": container with ID starting with 069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb not found: ID does not exist" Mar 19 09:23:07.849877 master-0 kubenswrapper[7457]: I0319 09:23:07.849845 7457 scope.go:117] "RemoveContainer" containerID="27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a" Mar 19 09:23:07.850288 master-0 kubenswrapper[7457]: E0319 09:23:07.850257 7457 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a\": container with ID starting with 27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a not found: ID does not exist" containerID="27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a" Mar 19 09:23:07.850288 master-0 kubenswrapper[7457]: I0319 09:23:07.850280 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a"} err="failed to get container status \"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a\": rpc error: code = NotFound desc = could not find container \"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a\": container with ID starting with 27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a not found: ID does not exist" Mar 19 09:23:07.850355 master-0 kubenswrapper[7457]: I0319 09:23:07.850295 7457 scope.go:117] "RemoveContainer" 
containerID="b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5" Mar 19 09:23:07.850623 master-0 kubenswrapper[7457]: E0319 09:23:07.850589 7457 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5\": container with ID starting with b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5 not found: ID does not exist" containerID="b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5" Mar 19 09:23:07.850680 master-0 kubenswrapper[7457]: I0319 09:23:07.850619 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5"} err="failed to get container status \"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5\": rpc error: code = NotFound desc = could not find container \"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5\": container with ID starting with b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5 not found: ID does not exist" Mar 19 09:23:07.850680 master-0 kubenswrapper[7457]: I0319 09:23:07.850635 7457 scope.go:117] "RemoveContainer" containerID="069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb" Mar 19 09:23:07.851094 master-0 kubenswrapper[7457]: I0319 09:23:07.851027 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb"} err="failed to get container status \"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb\": rpc error: code = NotFound desc = could not find container \"069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb\": container with ID starting with 069cd46511f1496d563603488879e03c4e434fd845682e28137208201ed446fb not found: ID does not exist" Mar 19 09:23:07.851094 master-0 
kubenswrapper[7457]: I0319 09:23:07.851082 7457 scope.go:117] "RemoveContainer" containerID="27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a" Mar 19 09:23:07.851596 master-0 kubenswrapper[7457]: I0319 09:23:07.851560 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a"} err="failed to get container status \"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a\": rpc error: code = NotFound desc = could not find container \"27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a\": container with ID starting with 27eb93b111442dd1018d18a9717ccaf342f348dc115b6c92da0327d6e54d568a not found: ID does not exist" Mar 19 09:23:07.851596 master-0 kubenswrapper[7457]: I0319 09:23:07.851587 7457 scope.go:117] "RemoveContainer" containerID="b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5" Mar 19 09:23:07.851889 master-0 kubenswrapper[7457]: I0319 09:23:07.851864 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5"} err="failed to get container status \"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5\": rpc error: code = NotFound desc = could not find container \"b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5\": container with ID starting with b50a3ad4aa8854403d338260a3eadefb28f4cb53b9e9991f8777218cdd5183f5 not found: ID does not exist" Mar 19 09:23:08.340307 master-0 kubenswrapper[7457]: I0319 09:23:08.340237 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f265536aba6292ead501bc9b49f327" path="/var/lib/kubelet/pods/46f265536aba6292ead501bc9b49f327/volumes" Mar 19 09:23:08.340708 master-0 kubenswrapper[7457]: I0319 09:23:08.340598 7457 mirror_client.go:130] "Deleting a mirror pod" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Mar 19 09:23:08.354884 master-0 kubenswrapper[7457]: I0319 09:23:08.354848 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 09:23:08.355116 master-0 kubenswrapper[7457]: I0319 09:23:08.355097 7457 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="abdfda4f-7e5b-456a-8b66-99681741e37c" Mar 19 09:23:08.358089 master-0 kubenswrapper[7457]: I0319 09:23:08.358060 7457 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 09:23:08.358209 master-0 kubenswrapper[7457]: I0319 09:23:08.358194 7457 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="abdfda4f-7e5b-456a-8b66-99681741e37c" Mar 19 09:23:12.488680 master-0 kubenswrapper[7457]: I0319 09:23:12.488351 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7dmw4"] Mar 19 09:23:12.489737 master-0 kubenswrapper[7457]: E0319 09:23:12.488783 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerName="installer" Mar 19 09:23:12.489737 master-0 kubenswrapper[7457]: I0319 09:23:12.488794 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerName="installer" Mar 19 09:23:12.489737 master-0 kubenswrapper[7457]: E0319 09:23:12.488810 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerName="installer" Mar 19 09:23:12.489737 master-0 kubenswrapper[7457]: I0319 09:23:12.488817 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerName="installer" Mar 19 
09:23:12.489737 master-0 kubenswrapper[7457]: I0319 09:23:12.488888 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerName="installer" Mar 19 09:23:12.489737 master-0 kubenswrapper[7457]: I0319 09:23:12.488905 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerName="installer" Mar 19 09:23:12.489737 master-0 kubenswrapper[7457]: I0319 09:23:12.489468 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.492610 master-0 kubenswrapper[7457]: I0319 09:23:12.492568 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-9tw96" Mar 19 09:23:12.495277 master-0 kubenswrapper[7457]: I0319 09:23:12.495214 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2ct9k"] Mar 19 09:23:12.498114 master-0 kubenswrapper[7457]: I0319 09:23:12.496666 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.500242 master-0 kubenswrapper[7457]: I0319 09:23:12.500202 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-p8jxl" Mar 19 09:23:12.527750 master-0 kubenswrapper[7457]: I0319 09:23:12.527699 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dmw4"] Mar 19 09:23:12.528757 master-0 kubenswrapper[7457]: I0319 09:23:12.528681 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2ct9k"] Mar 19 09:23:12.621910 master-0 kubenswrapper[7457]: I0319 09:23:12.621813 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kr8w\" (UniqueName: \"kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.621910 master-0 kubenswrapper[7457]: I0319 09:23:12.621890 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.621910 master-0 kubenswrapper[7457]: I0319 09:23:12.621935 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.622426 master-0 kubenswrapper[7457]: I0319 
09:23:12.622089 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.622426 master-0 kubenswrapper[7457]: I0319 09:23:12.622158 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.622426 master-0 kubenswrapper[7457]: I0319 09:23:12.622247 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6j2m\" (UniqueName: \"kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.723181 master-0 kubenswrapper[7457]: I0319 09:23:12.723107 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6j2m\" (UniqueName: \"kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.723181 master-0 kubenswrapper[7457]: I0319 09:23:12.723156 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kr8w\" (UniqueName: \"kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w\") pod \"certified-operators-7dmw4\" (UID: 
\"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.723181 master-0 kubenswrapper[7457]: I0319 09:23:12.723180 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.723181 master-0 kubenswrapper[7457]: I0319 09:23:12.723202 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.723738 master-0 kubenswrapper[7457]: I0319 09:23:12.723455 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.723981 master-0 kubenswrapper[7457]: I0319 09:23:12.723939 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.724040 master-0 kubenswrapper[7457]: I0319 09:23:12.723998 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content\") pod 
\"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.724230 master-0 kubenswrapper[7457]: I0319 09:23:12.724195 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.724230 master-0 kubenswrapper[7457]: I0319 09:23:12.724217 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.724599 master-0 kubenswrapper[7457]: I0319 09:23:12.724567 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.743653 master-0 kubenswrapper[7457]: I0319 09:23:12.743222 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6j2m\" (UniqueName: \"kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:12.743990 master-0 kubenswrapper[7457]: I0319 09:23:12.743951 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kr8w\" (UniqueName: 
\"kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.817157 master-0 kubenswrapper[7457]: I0319 09:23:12.817065 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:12.839935 master-0 kubenswrapper[7457]: I0319 09:23:12.838892 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:13.033151 master-0 kubenswrapper[7457]: I0319 09:23:13.032907 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dmw4"] Mar 19 09:23:13.040988 master-0 kubenswrapper[7457]: W0319 09:23:13.040919 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bce9154_cd31_4c4a_9d86_2903d5b1adad.slice/crio-b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7 WatchSource:0}: Error finding container b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7: Status 404 returned error can't find the container with id b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7 Mar 19 09:23:13.059894 master-0 kubenswrapper[7457]: I0319 09:23:13.059808 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2ct9k"] Mar 19 09:23:13.801618 master-0 kubenswrapper[7457]: I0319 09:23:13.801260 7457 generic.go:334] "Generic (PLEG): container finished" podID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerID="82319940bf8e72e7e1c996daea2af1c07c45f38503055b429ac09e5abb8f28d6" exitCode=0 Mar 19 09:23:13.801618 master-0 kubenswrapper[7457]: I0319 09:23:13.801360 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-2ct9k" event={"ID":"4d2c5580-36f6-4107-af53-cfbd15080b30","Type":"ContainerDied","Data":"82319940bf8e72e7e1c996daea2af1c07c45f38503055b429ac09e5abb8f28d6"} Mar 19 09:23:13.802908 master-0 kubenswrapper[7457]: I0319 09:23:13.801672 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ct9k" event={"ID":"4d2c5580-36f6-4107-af53-cfbd15080b30","Type":"ContainerStarted","Data":"813a77628cdb690ef9ed760c21cb05d1f17fab6329f59eb55493fe5e4d55f0d3"} Mar 19 09:23:13.810918 master-0 kubenswrapper[7457]: I0319 09:23:13.810858 7457 generic.go:334] "Generic (PLEG): container finished" podID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerID="bf149ff2c777ec19da6a404f555dbbecaec9d99f5badeb4692ea25e2aab65ea8" exitCode=0 Mar 19 09:23:13.810918 master-0 kubenswrapper[7457]: I0319 09:23:13.810930 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dmw4" event={"ID":"0bce9154-cd31-4c4a-9d86-2903d5b1adad","Type":"ContainerDied","Data":"bf149ff2c777ec19da6a404f555dbbecaec9d99f5badeb4692ea25e2aab65ea8"} Mar 19 09:23:13.811381 master-0 kubenswrapper[7457]: I0319 09:23:13.810976 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dmw4" event={"ID":"0bce9154-cd31-4c4a-9d86-2903d5b1adad","Type":"ContainerStarted","Data":"b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7"} Mar 19 09:23:14.043358 master-0 kubenswrapper[7457]: I0319 09:23:14.043293 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-995hm"] Mar 19 09:23:14.044279 master-0 kubenswrapper[7457]: I0319 09:23:14.044253 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.047591 master-0 kubenswrapper[7457]: I0319 09:23:14.046329 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-utilities\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.047591 master-0 kubenswrapper[7457]: I0319 09:23:14.046435 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6sr8\" (UniqueName: \"kubernetes.io/projected/c5966fa8-b9f0-42ee-a75b-20014782366d-kube-api-access-v6sr8\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.047591 master-0 kubenswrapper[7457]: I0319 09:23:14.046492 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-catalog-content\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.047591 master-0 kubenswrapper[7457]: I0319 09:23:14.046795 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-s9ktx" Mar 19 09:23:14.059276 master-0 kubenswrapper[7457]: I0319 09:23:14.058966 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-995hm"] Mar 19 09:23:14.150276 master-0 kubenswrapper[7457]: I0319 09:23:14.149308 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-utilities\") pod 
\"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.150276 master-0 kubenswrapper[7457]: I0319 09:23:14.149449 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-utilities\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.150276 master-0 kubenswrapper[7457]: I0319 09:23:14.149511 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6sr8\" (UniqueName: \"kubernetes.io/projected/c5966fa8-b9f0-42ee-a75b-20014782366d-kube-api-access-v6sr8\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.150276 master-0 kubenswrapper[7457]: I0319 09:23:14.149581 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-catalog-content\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.150276 master-0 kubenswrapper[7457]: I0319 09:23:14.149952 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-catalog-content\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.173434 master-0 kubenswrapper[7457]: I0319 09:23:14.173324 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6sr8\" (UniqueName: 
\"kubernetes.io/projected/c5966fa8-b9f0-42ee-a75b-20014782366d-kube-api-access-v6sr8\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.409891 master-0 kubenswrapper[7457]: I0319 09:23:14.409747 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:14.608250 master-0 kubenswrapper[7457]: I0319 09:23:14.608193 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-995hm"] Mar 19 09:23:14.617487 master-0 kubenswrapper[7457]: W0319 09:23:14.617443 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5966fa8_b9f0_42ee_a75b_20014782366d.slice/crio-915745b7520cddc649a8755f21f9aee4e835c7048a1ccbf6eea3461a98982a5e WatchSource:0}: Error finding container 915745b7520cddc649a8755f21f9aee4e835c7048a1ccbf6eea3461a98982a5e: Status 404 returned error can't find the container with id 915745b7520cddc649a8755f21f9aee4e835c7048a1ccbf6eea3461a98982a5e Mar 19 09:23:14.822816 master-0 kubenswrapper[7457]: I0319 09:23:14.822414 7457 generic.go:334] "Generic (PLEG): container finished" podID="c5966fa8-b9f0-42ee-a75b-20014782366d" containerID="1d94d4e69569ac4f86a917501b5ce54c3042abc1e756a92eeb7e23135f068b96" exitCode=0 Mar 19 09:23:14.822816 master-0 kubenswrapper[7457]: I0319 09:23:14.822512 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-995hm" event={"ID":"c5966fa8-b9f0-42ee-a75b-20014782366d","Type":"ContainerDied","Data":"1d94d4e69569ac4f86a917501b5ce54c3042abc1e756a92eeb7e23135f068b96"} Mar 19 09:23:14.822816 master-0 kubenswrapper[7457]: I0319 09:23:14.822784 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-995hm" 
event={"ID":"c5966fa8-b9f0-42ee-a75b-20014782366d","Type":"ContainerStarted","Data":"915745b7520cddc649a8755f21f9aee4e835c7048a1ccbf6eea3461a98982a5e"} Mar 19 09:23:15.049876 master-0 kubenswrapper[7457]: I0319 09:23:15.048887 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4gs4g"] Mar 19 09:23:15.057405 master-0 kubenswrapper[7457]: I0319 09:23:15.056006 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gs4g"] Mar 19 09:23:15.057405 master-0 kubenswrapper[7457]: I0319 09:23:15.056307 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.061833 master-0 kubenswrapper[7457]: I0319 09:23:15.060719 7457 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2svn2" Mar 19 09:23:15.162208 master-0 kubenswrapper[7457]: I0319 09:23:15.162154 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-utilities\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.162395 master-0 kubenswrapper[7457]: I0319 09:23:15.162219 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxq7\" (UniqueName: \"kubernetes.io/projected/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-kube-api-access-6hxq7\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.162395 master-0 kubenswrapper[7457]: I0319 09:23:15.162256 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-catalog-content\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.263656 master-0 kubenswrapper[7457]: I0319 09:23:15.263565 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-utilities\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.263656 master-0 kubenswrapper[7457]: I0319 09:23:15.263658 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxq7\" (UniqueName: \"kubernetes.io/projected/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-kube-api-access-6hxq7\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.264062 master-0 kubenswrapper[7457]: I0319 09:23:15.263705 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-catalog-content\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.264539 master-0 kubenswrapper[7457]: I0319 09:23:15.264271 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-catalog-content\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.264539 master-0 kubenswrapper[7457]: I0319 09:23:15.264512 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-utilities\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.285356 master-0 kubenswrapper[7457]: I0319 09:23:15.285278 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxq7\" (UniqueName: \"kubernetes.io/projected/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-kube-api-access-6hxq7\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.380425 master-0 kubenswrapper[7457]: I0319 09:23:15.380291 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:15.698604 master-0 kubenswrapper[7457]: I0319 09:23:15.698479 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:15.698604 master-0 kubenswrapper[7457]: I0319 09:23:15.698551 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:15.698604 master-0 kubenswrapper[7457]: I0319 09:23:15.698565 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:15.698604 master-0 kubenswrapper[7457]: I0319 09:23:15.698576 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:15.702617 master-0 kubenswrapper[7457]: I0319 09:23:15.702591 7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:15.705013 master-0 kubenswrapper[7457]: I0319 09:23:15.704981 
7457 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:15.799865 master-0 kubenswrapper[7457]: I0319 09:23:15.799574 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gs4g"] Mar 19 09:23:15.831191 master-0 kubenswrapper[7457]: I0319 09:23:15.831076 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gs4g" event={"ID":"bf5dde46-8a95-46a6-bee5-20d3a58f33ee","Type":"ContainerStarted","Data":"ecba3762dd2d103496ed9fed52be51c550935d62b9dab4b76da7f92f8e0395b8"} Mar 19 09:23:15.835346 master-0 kubenswrapper[7457]: I0319 09:23:15.835180 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:16.832508 master-0 kubenswrapper[7457]: I0319 09:23:16.832453 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dmw4"] Mar 19 09:23:16.846654 master-0 kubenswrapper[7457]: I0319 09:23:16.846600 7457 generic.go:334] "Generic (PLEG): container finished" podID="bf5dde46-8a95-46a6-bee5-20d3a58f33ee" containerID="6c747f057ccd974ebffe1ad8f45ae4d2b2720a2e2e97d3a1aa69720b2461f5fb" exitCode=0 Mar 19 09:23:16.848483 master-0 kubenswrapper[7457]: I0319 09:23:16.848447 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gs4g" event={"ID":"bf5dde46-8a95-46a6-bee5-20d3a58f33ee","Type":"ContainerDied","Data":"6c747f057ccd974ebffe1ad8f45ae4d2b2720a2e2e97d3a1aa69720b2461f5fb"} Mar 19 09:23:16.852110 master-0 kubenswrapper[7457]: I0319 09:23:16.852028 7457 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:17.245442 master-0 kubenswrapper[7457]: I0319 09:23:17.245284 7457 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/certified-operators-xr42z"] Mar 19 09:23:17.246294 master-0 kubenswrapper[7457]: I0319 09:23:17.246264 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.270846 master-0 kubenswrapper[7457]: I0319 09:23:17.258769 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xr42z"] Mar 19 09:23:17.390447 master-0 kubenswrapper[7457]: I0319 09:23:17.390375 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-catalog-content\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.390447 master-0 kubenswrapper[7457]: I0319 09:23:17.390463 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7jx\" (UniqueName: \"kubernetes.io/projected/741c9d25-7634-41c0-bfe4-b7a15de4b341-kube-api-access-4w7jx\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.390969 master-0 kubenswrapper[7457]: I0319 09:23:17.390502 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-utilities\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.492478 master-0 kubenswrapper[7457]: I0319 09:23:17.492398 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7jx\" (UniqueName: 
\"kubernetes.io/projected/741c9d25-7634-41c0-bfe4-b7a15de4b341-kube-api-access-4w7jx\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.492478 master-0 kubenswrapper[7457]: I0319 09:23:17.492498 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-utilities\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.493031 master-0 kubenswrapper[7457]: I0319 09:23:17.492580 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-catalog-content\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.493367 master-0 kubenswrapper[7457]: I0319 09:23:17.493331 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-catalog-content\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.494447 master-0 kubenswrapper[7457]: I0319 09:23:17.494358 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-utilities\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.526096 master-0 kubenswrapper[7457]: I0319 09:23:17.525977 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7jx\" 
(UniqueName: \"kubernetes.io/projected/741c9d25-7634-41c0-bfe4-b7a15de4b341-kube-api-access-4w7jx\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.572464 master-0 kubenswrapper[7457]: I0319 09:23:17.572351 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:17.831416 master-0 kubenswrapper[7457]: I0319 09:23:17.831069 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xr42z"] Mar 19 09:23:17.835344 master-0 kubenswrapper[7457]: I0319 09:23:17.835313 7457 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2ct9k"] Mar 19 09:23:17.840901 master-0 kubenswrapper[7457]: W0319 09:23:17.840864 7457 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod741c9d25_7634_41c0_bfe4_b7a15de4b341.slice/crio-d146d705aef07c62cae684b04e48b6f5db2109ae5a19f2d427637f8db5f61221 WatchSource:0}: Error finding container d146d705aef07c62cae684b04e48b6f5db2109ae5a19f2d427637f8db5f61221: Status 404 returned error can't find the container with id d146d705aef07c62cae684b04e48b6f5db2109ae5a19f2d427637f8db5f61221 Mar 19 09:23:17.854098 master-0 kubenswrapper[7457]: I0319 09:23:17.854062 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr42z" event={"ID":"741c9d25-7634-41c0-bfe4-b7a15de4b341","Type":"ContainerStarted","Data":"d146d705aef07c62cae684b04e48b6f5db2109ae5a19f2d427637f8db5f61221"} Mar 19 09:23:18.246576 master-0 kubenswrapper[7457]: I0319 09:23:18.245836 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wfkb9"] Mar 19 09:23:18.247206 master-0 kubenswrapper[7457]: I0319 09:23:18.246983 7457 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.256592 master-0 kubenswrapper[7457]: I0319 09:23:18.256509 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfkb9"] Mar 19 09:23:18.401883 master-0 kubenswrapper[7457]: I0319 09:23:18.401824 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2x6\" (UniqueName: \"kubernetes.io/projected/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-kube-api-access-jw2x6\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.401883 master-0 kubenswrapper[7457]: I0319 09:23:18.401873 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-utilities\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.402096 master-0 kubenswrapper[7457]: I0319 09:23:18.401906 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-catalog-content\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.503517 master-0 kubenswrapper[7457]: I0319 09:23:18.503423 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-utilities\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.503758 master-0 
kubenswrapper[7457]: I0319 09:23:18.503710 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-catalog-content\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.503908 master-0 kubenswrapper[7457]: I0319 09:23:18.503885 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2x6\" (UniqueName: \"kubernetes.io/projected/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-kube-api-access-jw2x6\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.504297 master-0 kubenswrapper[7457]: I0319 09:23:18.504276 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-catalog-content\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.504336 master-0 kubenswrapper[7457]: I0319 09:23:18.504284 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-utilities\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.529882 master-0 kubenswrapper[7457]: I0319 09:23:18.529819 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2x6\" (UniqueName: \"kubernetes.io/projected/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-kube-api-access-jw2x6\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " 
pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.561739 master-0 kubenswrapper[7457]: I0319 09:23:18.561674 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:18.817305 master-0 kubenswrapper[7457]: I0319 09:23:18.816975 7457 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wfkb9"] Mar 19 09:23:18.877417 master-0 kubenswrapper[7457]: I0319 09:23:18.877370 7457 generic.go:334] "Generic (PLEG): container finished" podID="741c9d25-7634-41c0-bfe4-b7a15de4b341" containerID="e404f4723d8631f073faddbc4262589635650594028697bdf7da895f9918c63d" exitCode=0 Mar 19 09:23:18.878354 master-0 kubenswrapper[7457]: I0319 09:23:18.877455 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr42z" event={"ID":"741c9d25-7634-41c0-bfe4-b7a15de4b341","Type":"ContainerDied","Data":"e404f4723d8631f073faddbc4262589635650594028697bdf7da895f9918c63d"} Mar 19 09:23:18.880675 master-0 kubenswrapper[7457]: I0319 09:23:18.880644 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfkb9" event={"ID":"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2","Type":"ContainerStarted","Data":"c09069a2109d448b73c63e5e3d2a41051b8198531ad6e6a692843369313b17a8"} Mar 19 09:23:19.886510 master-0 kubenswrapper[7457]: I0319 09:23:19.886433 7457 generic.go:334] "Generic (PLEG): container finished" podID="ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2" containerID="6a4a22a5cf9a6cf3b1dd393632748fd3e2677a48e7f3293f0563dbc6ae33d7aa" exitCode=0 Mar 19 09:23:19.886510 master-0 kubenswrapper[7457]: I0319 09:23:19.886473 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfkb9" event={"ID":"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2","Type":"ContainerDied","Data":"6a4a22a5cf9a6cf3b1dd393632748fd3e2677a48e7f3293f0563dbc6ae33d7aa"} Mar 19 
09:23:29.946656 master-0 kubenswrapper[7457]: I0319 09:23:29.946075 7457 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:23:29.947645 master-0 kubenswrapper[7457]: I0319 09:23:29.947310 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:29.952046 master-0 kubenswrapper[7457]: I0319 09:23:29.951700 7457 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 09:23:29.952155 master-0 kubenswrapper[7457]: I0319 09:23:29.952026 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" containerID="cri-o://0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34" gracePeriod=15 Mar 19 09:23:29.953066 master-0 kubenswrapper[7457]: I0319 09:23:29.953024 7457 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0" gracePeriod=15 Mar 19 09:23:29.954225 master-0 kubenswrapper[7457]: I0319 09:23:29.954159 7457 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: E0319 09:23:29.954813 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: I0319 09:23:29.954851 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" 
containerName="kube-apiserver" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: E0319 09:23:29.954862 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: I0319 09:23:29.954869 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: E0319 09:23:29.954888 7457 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: I0319 09:23:29.954894 7457 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: I0319 09:23:29.955085 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: I0319 09:23:29.955103 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 09:23:29.956500 master-0 kubenswrapper[7457]: I0319 09:23:29.955120 7457 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 09:23:29.957332 master-0 kubenswrapper[7457]: I0319 09:23:29.956964 7457 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.033170 master-0 kubenswrapper[7457]: I0319 09:23:30.033094 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:23:30.035781 master-0 kubenswrapper[7457]: I0319 09:23:30.035719 7457 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:23:30.058748 master-0 kubenswrapper[7457]: I0319 09:23:30.058422 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.058994 master-0 kubenswrapper[7457]: I0319 09:23:30.058976 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.059075 master-0 kubenswrapper[7457]: I0319 09:23:30.059063 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.059151 master-0 kubenswrapper[7457]: I0319 09:23:30.059140 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.059252 master-0 kubenswrapper[7457]: I0319 09:23:30.059233 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.059351 master-0 kubenswrapper[7457]: I0319 09:23:30.059334 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.059503 master-0 kubenswrapper[7457]: I0319 09:23:30.059484 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.059648 master-0 kubenswrapper[7457]: I0319 09:23:30.059629 7457 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.160805 master-0 kubenswrapper[7457]: 
I0319 09:23:30.160746 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.160805 master-0 kubenswrapper[7457]: I0319 09:23:30.160808 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161093 master-0 kubenswrapper[7457]: I0319 09:23:30.160912 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.161093 master-0 kubenswrapper[7457]: I0319 09:23:30.161044 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161093 master-0 kubenswrapper[7457]: I0319 09:23:30.161043 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.161093 master-0 kubenswrapper[7457]: I0319 
09:23:30.161087 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.161212 master-0 kubenswrapper[7457]: I0319 09:23:30.161112 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161212 master-0 kubenswrapper[7457]: I0319 09:23:30.161171 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161271 master-0 kubenswrapper[7457]: I0319 09:23:30.161253 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.161319 master-0 kubenswrapper[7457]: I0319 09:23:30.161300 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 
09:23:30.161359 master-0 kubenswrapper[7457]: I0319 09:23:30.161345 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161415 master-0 kubenswrapper[7457]: I0319 09:23:30.161388 7457 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161596 master-0 kubenswrapper[7457]: I0319 09:23:30.161498 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161596 master-0 kubenswrapper[7457]: I0319 09:23:30.161503 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.161596 master-0 kubenswrapper[7457]: I0319 09:23:30.161527 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.161596 master-0 kubenswrapper[7457]: I0319 09:23:30.161575 7457 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:30.324758 master-0 kubenswrapper[7457]: I0319 09:23:30.324698 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:30.324946 master-0 kubenswrapper[7457]: I0319 09:23:30.324717 7457 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:31.016559 master-0 kubenswrapper[7457]: E0319 09:23:31.016479 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:31.017480 master-0 kubenswrapper[7457]: E0319 09:23:31.017425 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:31.017995 master-0 kubenswrapper[7457]: E0319 09:23:31.017945 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:31.018712 master-0 kubenswrapper[7457]: E0319 09:23:31.018662 7457 controller.go:195] "Failed to update 
lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:31.019282 master-0 kubenswrapper[7457]: E0319 09:23:31.019237 7457 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:31.019319 master-0 kubenswrapper[7457]: I0319 09:23:31.019285 7457 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:23:31.019963 master-0 kubenswrapper[7457]: E0319 09:23:31.019909 7457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 19 09:23:31.221587 master-0 kubenswrapper[7457]: E0319 09:23:31.221462 7457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 09:23:31.622833 master-0 kubenswrapper[7457]: E0319 09:23:31.622787 7457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 09:23:32.424157 master-0 kubenswrapper[7457]: E0319 09:23:32.424096 7457 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 09:23:33.898629 master-0 kubenswrapper[7457]: E0319 09:23:33.898196 7457 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189e33c28bfefc98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:ac3507630eeeca1ec26dca5ed036e3bb,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:23:33.895969944 +0000 UTC m=+209.751309314,LastTimestamp:2026-03-19 09:23:33.895969944 +0000 UTC m=+209.751309314,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:23:33.925107 master-0 kubenswrapper[7457]: I0319 09:23:33.925060 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:23:33.925984 master-0 kubenswrapper[7457]: I0319 09:23:33.925925 7457 status_manager.go:851] "Failed to get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:33.926874 master-0 kubenswrapper[7457]: I0319 09:23:33.926769 7457 status_manager.go:851] "Failed to get status for pod" podUID="95378a840215d5780aa88df876aac909" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:33.974011 master-0 kubenswrapper[7457]: I0319 09:23:33.973683 7457 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0" exitCode=0 Mar 19 09:23:33.974107 master-0 kubenswrapper[7457]: I0319 09:23:33.974017 7457 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34" exitCode=0 Mar 19 09:23:33.974107 master-0 kubenswrapper[7457]: I0319 09:23:33.973785 7457 scope.go:117] "RemoveContainer" containerID="4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0" Mar 19 09:23:33.974107 master-0 kubenswrapper[7457]: I0319 09:23:33.973863 7457 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:23:33.976171 master-0 kubenswrapper[7457]: I0319 09:23:33.976133 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerStarted","Data":"8bbb7eb717a10731a76fbab7e75a4760990dac18f169f5c55d4ff290082a576b"} Mar 19 09:23:33.978901 master-0 kubenswrapper[7457]: I0319 09:23:33.978869 7457 generic.go:334] "Generic (PLEG): container finished" podID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" containerID="f2f4573ac6359250badfecb43e628f11a57ba451127ad683fe2723ca4c3b389c" exitCode=0 Mar 19 09:23:33.978979 master-0 kubenswrapper[7457]: I0319 09:23:33.978956 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"ff98fb1e-7a1f-4657-b085-743d6f2d28e2","Type":"ContainerDied","Data":"f2f4573ac6359250badfecb43e628f11a57ba451127ad683fe2723ca4c3b389c"} Mar 19 09:23:33.979708 master-0 kubenswrapper[7457]: I0319 09:23:33.979658 7457 status_manager.go:851] "Failed to get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:33.980473 master-0 kubenswrapper[7457]: I0319 09:23:33.980409 7457 status_manager.go:851] "Failed to get status for pod" podUID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:33.981104 master-0 kubenswrapper[7457]: I0319 09:23:33.981054 7457 status_manager.go:851] "Failed to get status 
for pod" podUID="95378a840215d5780aa88df876aac909" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:33.981579 master-0 kubenswrapper[7457]: I0319 09:23:33.981517 7457 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"95378a840215d5780aa88df876aac909","Type":"ContainerStarted","Data":"76a2be65b345aaa03d42847ddf4106be40d256a72f66630810b64aeb72f9c081"} Mar 19 09:23:34.007703 master-0 kubenswrapper[7457]: I0319 09:23:34.007634 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:23:34.007703 master-0 kubenswrapper[7457]: I0319 09:23:34.007683 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:23:34.007833 master-0 kubenswrapper[7457]: I0319 09:23:34.007722 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:23:34.007833 master-0 kubenswrapper[7457]: I0319 09:23:34.007735 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config" (OuterVolumeSpecName: "config") pod "49fac1b46a11e49501805e891baae4a9" (UID: 
"49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:34.007833 master-0 kubenswrapper[7457]: I0319 09:23:34.007755 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:23:34.007833 master-0 kubenswrapper[7457]: I0319 09:23:34.007789 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:34.007833 master-0 kubenswrapper[7457]: I0319 09:23:34.007788 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:34.008278 master-0 kubenswrapper[7457]: I0319 09:23:34.007820 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs" (OuterVolumeSpecName: "logs") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:34.008278 master-0 kubenswrapper[7457]: I0319 09:23:34.007927 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:23:34.008278 master-0 kubenswrapper[7457]: I0319 09:23:34.007999 7457 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:23:34.008385 master-0 kubenswrapper[7457]: I0319 09:23:34.008336 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:34.008421 master-0 kubenswrapper[7457]: I0319 09:23:34.008393 7457 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets" (OuterVolumeSpecName: "secrets") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:34.008601 master-0 kubenswrapper[7457]: I0319 09:23:34.008573 7457 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:34.008654 master-0 kubenswrapper[7457]: I0319 09:23:34.008600 7457 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:34.008654 master-0 kubenswrapper[7457]: I0319 09:23:34.008621 7457 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:34.008654 master-0 kubenswrapper[7457]: I0319 09:23:34.008633 7457 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:34.008654 master-0 kubenswrapper[7457]: I0319 09:23:34.008646 7457 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:34.008769 master-0 kubenswrapper[7457]: I0319 09:23:34.008660 7457 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:34.016260 master-0 kubenswrapper[7457]: I0319 09:23:34.013912 7457 scope.go:117] "RemoveContainer" containerID="0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34" Mar 19 09:23:34.025745 master-0 kubenswrapper[7457]: E0319 09:23:34.025679 7457 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 09:23:34.061356 master-0 kubenswrapper[7457]: I0319 09:23:34.061302 7457 scope.go:117] "RemoveContainer" containerID="5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8" Mar 19 09:23:34.138806 master-0 kubenswrapper[7457]: I0319 09:23:34.138685 7457 scope.go:117] "RemoveContainer" containerID="4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0" Mar 19 09:23:34.139255 master-0 kubenswrapper[7457]: E0319 09:23:34.139145 7457 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0\": container with ID starting with 4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0 not found: ID does not exist" containerID="4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0" Mar 19 09:23:34.139255 master-0 kubenswrapper[7457]: I0319 09:23:34.139182 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0"} err="failed to get container status \"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0\": rpc error: code = NotFound desc = could not find container \"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0\": container with ID starting with 4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0 not found: ID does not exist" Mar 19 09:23:34.139255 master-0 kubenswrapper[7457]: I0319 09:23:34.139204 7457 scope.go:117] "RemoveContainer" containerID="0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34" Mar 19 09:23:34.139846 master-0 kubenswrapper[7457]: E0319 09:23:34.139808 7457 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34\": container with ID starting with 0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34 not found: ID does not exist" containerID="0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34" Mar 19 09:23:34.139846 master-0 kubenswrapper[7457]: I0319 09:23:34.139838 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34"} err="failed to get container status \"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34\": rpc error: code = NotFound desc = could not find container \"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34\": container with ID starting with 0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34 not found: ID does not exist" Mar 19 09:23:34.140021 master-0 kubenswrapper[7457]: I0319 09:23:34.139854 7457 scope.go:117] "RemoveContainer" containerID="5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8" Mar 19 09:23:34.140328 master-0 kubenswrapper[7457]: E0319 09:23:34.140294 7457 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8\": container with ID starting with 5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8 not found: ID does not exist" containerID="5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8" Mar 19 09:23:34.140393 master-0 kubenswrapper[7457]: I0319 09:23:34.140326 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8"} err="failed to get container status 
\"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8\": rpc error: code = NotFound desc = could not find container \"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8\": container with ID starting with 5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8 not found: ID does not exist" Mar 19 09:23:34.140393 master-0 kubenswrapper[7457]: I0319 09:23:34.140342 7457 scope.go:117] "RemoveContainer" containerID="4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0" Mar 19 09:23:34.140778 master-0 kubenswrapper[7457]: I0319 09:23:34.140731 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0"} err="failed to get container status \"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0\": rpc error: code = NotFound desc = could not find container \"4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0\": container with ID starting with 4fb6e9db1ae3cee2e5d635c4ba7fade76e08dd24937623e0e023f7a10238cba0 not found: ID does not exist" Mar 19 09:23:34.140778 master-0 kubenswrapper[7457]: I0319 09:23:34.140765 7457 scope.go:117] "RemoveContainer" containerID="0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34" Mar 19 09:23:34.140999 master-0 kubenswrapper[7457]: I0319 09:23:34.140967 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34"} err="failed to get container status \"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34\": rpc error: code = NotFound desc = could not find container \"0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34\": container with ID starting with 0b2f1dac19345b88e62a04fc327c4678467e5c896bbbbb4ef01aac3a406cec34 not found: ID does not exist" Mar 19 09:23:34.140999 master-0 kubenswrapper[7457]: I0319 09:23:34.140988 7457 
scope.go:117] "RemoveContainer" containerID="5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8" Mar 19 09:23:34.141277 master-0 kubenswrapper[7457]: I0319 09:23:34.141241 7457 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8"} err="failed to get container status \"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8\": rpc error: code = NotFound desc = could not find container \"5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8\": container with ID starting with 5ecd12f6040db3688d68ae7133c1ce02aa1f16974167368570dff59e36681cb8 not found: ID does not exist" Mar 19 09:23:34.328028 master-0 kubenswrapper[7457]: I0319 09:23:34.327971 7457 status_manager.go:851] "Failed to get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:34.328867 master-0 kubenswrapper[7457]: I0319 09:23:34.328812 7457 status_manager.go:851] "Failed to get status for pod" podUID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:34.329402 master-0 kubenswrapper[7457]: I0319 09:23:34.329359 7457 status_manager.go:851] "Failed to get status for pod" podUID="95378a840215d5780aa88df876aac909" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection 
refused" Mar 19 09:23:34.339750 master-0 kubenswrapper[7457]: I0319 09:23:34.339687 7457 status_manager.go:851] "Failed to get status for pod" podUID="95378a840215d5780aa88df876aac909" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:34.340360 master-0 kubenswrapper[7457]: I0319 09:23:34.340292 7457 status_manager.go:851] "Failed to get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:34.341792 master-0 kubenswrapper[7457]: I0319 09:23:34.341630 7457 status_manager.go:851] "Failed to get status for pod" podUID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:23:34.344039 master-0 kubenswrapper[7457]: I0319 09:23:34.343989 7457 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fac1b46a11e49501805e891baae4a9" path="/var/lib/kubelet/pods/49fac1b46a11e49501805e891baae4a9/volumes" Mar 19 09:23:34.344513 master-0 kubenswrapper[7457]: I0319 09:23:34.344398 7457 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 19 09:23:34.345228 master-0 kubenswrapper[7457]: E0319 09:23:34.345187 7457 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/bootstrap-kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:23:34.522422 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 19 09:23:34.526598 master-0 kubenswrapper[7457]: I0319 09:23:34.526073 7457 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 09:23:34.544119 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 19 09:23:34.544388 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 19 09:23:34.545492 master-0 systemd[1]: kubelet.service: Consumed 21.863s CPU time. Mar 19 09:23:34.561289 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 19 09:23:34.705554 master-0 kubenswrapper[13205]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:23:34.706412 master-0 kubenswrapper[13205]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 09:23:34.706472 master-0 kubenswrapper[13205]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:23:34.706560 master-0 kubenswrapper[13205]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 09:23:34.706640 master-0 kubenswrapper[13205]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 09:23:34.706698 master-0 kubenswrapper[13205]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:23:34.706969 master-0 kubenswrapper[13205]: I0319 09:23:34.706841 13205 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 09:23:34.709615 master-0 kubenswrapper[13205]: W0319 09:23:34.709598 13205 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:23:34.709697 master-0 kubenswrapper[13205]: W0319 09:23:34.709688 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:23:34.709756 master-0 kubenswrapper[13205]: W0319 09:23:34.709747 13205 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:23:34.709805 master-0 kubenswrapper[13205]: W0319 09:23:34.709797 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:23:34.709860 master-0 kubenswrapper[13205]: W0319 09:23:34.709852 13205 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 09:23:34.709910 master-0 kubenswrapper[13205]: W0319 09:23:34.709902 13205 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:23:34.709961 master-0 kubenswrapper[13205]: W0319 09:23:34.709953 13205 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:23:34.710027 master-0 kubenswrapper[13205]: W0319 09:23:34.710017 13205 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:23:34.710087 master-0 kubenswrapper[13205]: W0319 09:23:34.710077 13205 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:23:34.710152 master-0 kubenswrapper[13205]: W0319 09:23:34.710141 13205 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:23:34.710214 master-0 kubenswrapper[13205]: W0319 09:23:34.710204 13205 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:23:34.710280 master-0 kubenswrapper[13205]: W0319 09:23:34.710269 13205 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:23:34.710349 master-0 kubenswrapper[13205]: W0319 09:23:34.710338 13205 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:23:34.710409 master-0 kubenswrapper[13205]: W0319 09:23:34.710400 13205 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:23:34.710456 master-0 kubenswrapper[13205]: W0319 09:23:34.710449 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:23:34.710512 master-0 kubenswrapper[13205]: W0319 09:23:34.710502 13205 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:23:34.710632 master-0 kubenswrapper[13205]: W0319 09:23:34.710621 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:23:34.710691 master-0 kubenswrapper[13205]: W0319 09:23:34.710682 13205 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:23:34.710743 master-0 kubenswrapper[13205]: W0319 09:23:34.710735 13205 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:23:34.710793 master-0 kubenswrapper[13205]: W0319 09:23:34.710786 13205 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:23:34.710851 master-0 kubenswrapper[13205]: W0319 09:23:34.710841 13205 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:23:34.710910 master-0 kubenswrapper[13205]: W0319 09:23:34.710902 13205 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:23:34.710970 master-0 kubenswrapper[13205]: W0319 09:23:34.710960 13205 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:23:34.711028 master-0 kubenswrapper[13205]: W0319 09:23:34.711017 13205 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:23:34.711096 master-0 kubenswrapper[13205]: W0319 09:23:34.711084 13205 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:23:34.711159 master-0 kubenswrapper[13205]: W0319 09:23:34.711149 13205 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:23:34.711227 master-0 kubenswrapper[13205]: W0319 09:23:34.711217 13205 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:23:34.711288 master-0 kubenswrapper[13205]: W0319 09:23:34.711278 13205 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:23:34.711359 master-0 kubenswrapper[13205]: W0319 09:23:34.711348 13205 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:23:34.711423 master-0 kubenswrapper[13205]: W0319 09:23:34.711412 13205 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:23:34.711489 master-0 kubenswrapper[13205]: W0319 09:23:34.711478 13205 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:23:34.711581 master-0 kubenswrapper[13205]: W0319 09:23:34.711569 13205 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:23:34.711660 master-0 kubenswrapper[13205]: W0319 09:23:34.711648 13205 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:23:34.711731 master-0 kubenswrapper[13205]: W0319 09:23:34.711721 13205 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:23:34.711798 master-0 kubenswrapper[13205]: W0319 09:23:34.711788 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:23:34.711861 master-0 kubenswrapper[13205]: W0319 09:23:34.711849 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:23:34.711928 master-0 kubenswrapper[13205]: W0319 09:23:34.711916 13205 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:23:34.712009 master-0 kubenswrapper[13205]: W0319 09:23:34.711998 13205 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:23:34.712073 master-0 kubenswrapper[13205]: W0319 09:23:34.712062 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:23:34.712133 master-0 kubenswrapper[13205]: W0319 09:23:34.712123 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:23:34.712207 master-0 kubenswrapper[13205]: W0319 09:23:34.712198 13205 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:23:34.712256 master-0 kubenswrapper[13205]: W0319 09:23:34.712248 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:23:34.712301 master-0 kubenswrapper[13205]: W0319 09:23:34.712293 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:23:34.712365 master-0 kubenswrapper[13205]: W0319 09:23:34.712356 13205 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:23:34.712416 master-0 kubenswrapper[13205]: W0319 09:23:34.712408 13205 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:23:34.712461 master-0 kubenswrapper[13205]: W0319 09:23:34.712453 13205 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:23:34.712508 master-0 kubenswrapper[13205]: W0319 09:23:34.712501 13205 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:23:34.712608 master-0 kubenswrapper[13205]: W0319 09:23:34.712594 13205 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:23:34.712686 master-0 kubenswrapper[13205]: W0319 09:23:34.712676 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:23:34.712753 master-0 kubenswrapper[13205]: W0319 09:23:34.712741 13205 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:23:34.712813 master-0 kubenswrapper[13205]: W0319 09:23:34.712804 13205 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:23:34.712861 master-0 kubenswrapper[13205]: W0319 09:23:34.712854 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:23:34.712904 master-0 kubenswrapper[13205]: W0319 09:23:34.712897 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:23:34.712954 master-0 kubenswrapper[13205]: W0319 09:23:34.712945 13205 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:23:34.713001 master-0 kubenswrapper[13205]: W0319 09:23:34.712993 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:23:34.713047 master-0 kubenswrapper[13205]: W0319 09:23:34.713040 13205 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:23:34.713123 master-0 kubenswrapper[13205]: W0319 09:23:34.713110 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:23:34.713185 master-0 kubenswrapper[13205]: W0319 09:23:34.713177 13205 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:23:34.713229 master-0 kubenswrapper[13205]: W0319 09:23:34.713222 13205 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:23:34.713279 master-0 kubenswrapper[13205]: W0319 09:23:34.713272 13205 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:23:34.713367 master-0 kubenswrapper[13205]: W0319 09:23:34.713358 13205 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:23:34.713411 master-0 kubenswrapper[13205]: W0319 09:23:34.713404 13205 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:23:34.713454 master-0 kubenswrapper[13205]: W0319 09:23:34.713447 13205 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:23:34.713501 master-0 kubenswrapper[13205]: W0319 09:23:34.713493 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:23:34.713575 master-0 kubenswrapper[13205]: W0319 09:23:34.713567 13205 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:23:34.713630 master-0 kubenswrapper[13205]: W0319 09:23:34.713622 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:23:34.713677 master-0 kubenswrapper[13205]: W0319 09:23:34.713669 13205 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:23:34.713720 master-0 kubenswrapper[13205]: W0319 09:23:34.713712 13205 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:23:34.713769 master-0 kubenswrapper[13205]: W0319 09:23:34.713761 13205 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:23:34.713857 master-0 kubenswrapper[13205]: W0319 09:23:34.713844 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:23:34.713929 master-0 kubenswrapper[13205]: W0319 09:23:34.713919 13205 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:23:34.714003 master-0 kubenswrapper[13205]: W0319 09:23:34.713993 13205 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:23:34.714190 master-0 kubenswrapper[13205]: I0319 09:23:34.714172 13205 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 09:23:34.714254 master-0 kubenswrapper[13205]: I0319 09:23:34.714241 13205 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 09:23:34.714303 master-0 kubenswrapper[13205]: I0319 09:23:34.714292 13205 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 09:23:34.714356 master-0 kubenswrapper[13205]: I0319 09:23:34.714346 13205 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 09:23:34.714407 master-0 kubenswrapper[13205]: I0319 09:23:34.714398 13205 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 09:23:34.714460 master-0 kubenswrapper[13205]: I0319 09:23:34.714449 13205 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 09:23:34.714512 master-0 kubenswrapper[13205]: I0319 09:23:34.714502 13205 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 09:23:34.714596 master-0 kubenswrapper[13205]: I0319 09:23:34.714584 13205 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 09:23:34.714652 master-0 kubenswrapper[13205]: I0319 09:23:34.714643 13205 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 09:23:34.714699 master-0 kubenswrapper[13205]: I0319 09:23:34.714690 13205 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 09:23:34.714751 master-0 kubenswrapper[13205]: I0319 09:23:34.714743 13205 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 09:23:34.714796 master-0 kubenswrapper[13205]: I0319 09:23:34.714788 13205 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 09:23:34.714846 master-0 kubenswrapper[13205]: I0319 09:23:34.714838 13205 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 09:23:34.714898 master-0 kubenswrapper[13205]: I0319 09:23:34.714890 13205 flags.go:64] FLAG: --cgroup-root=""
Mar 19 09:23:34.714948 master-0 kubenswrapper[13205]: I0319 09:23:34.714940 13205 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 09:23:34.714996 master-0 kubenswrapper[13205]: I0319 09:23:34.714988 13205 flags.go:64] FLAG: --client-ca-file=""
Mar 19 09:23:34.715045 master-0 kubenswrapper[13205]: I0319 09:23:34.715037 13205 flags.go:64] FLAG: --cloud-config=""
Mar 19 09:23:34.715098 master-0 kubenswrapper[13205]: I0319 09:23:34.715090 13205 flags.go:64] FLAG: --cloud-provider=""
Mar 19 09:23:34.715146 master-0 kubenswrapper[13205]: I0319 09:23:34.715134 13205 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 09:23:34.715195 master-0 kubenswrapper[13205]: I0319 09:23:34.715187 13205 flags.go:64] FLAG: --cluster-domain=""
Mar 19 09:23:34.715245 master-0 kubenswrapper[13205]: I0319 09:23:34.715237 13205 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 09:23:34.715290 master-0 kubenswrapper[13205]: I0319 09:23:34.715282 13205 flags.go:64] FLAG: --config-dir=""
Mar 19 09:23:34.715335 master-0 kubenswrapper[13205]: I0319 09:23:34.715326 13205 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 09:23:34.715381 master-0 kubenswrapper[13205]: I0319 09:23:34.715371 13205 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 09:23:34.715425 master-0 kubenswrapper[13205]: I0319 09:23:34.715417 13205 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 09:23:34.715477 master-0 kubenswrapper[13205]: I0319 09:23:34.715469 13205 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 09:23:34.715557 master-0 kubenswrapper[13205]: I0319 09:23:34.715546 13205 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 09:23:34.715630 master-0 kubenswrapper[13205]: I0319 09:23:34.715619 13205 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 09:23:34.715700 master-0 kubenswrapper[13205]: I0319 09:23:34.715688 13205 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 09:23:34.715766 master-0 kubenswrapper[13205]: I0319 09:23:34.715756 13205 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 09:23:34.715816 master-0 kubenswrapper[13205]: I0319 09:23:34.715807 13205 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 09:23:34.715860 master-0 kubenswrapper[13205]: I0319 09:23:34.715852 13205 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 09:23:34.715926 master-0 kubenswrapper[13205]: I0319 09:23:34.715913 13205 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 09:23:34.715982 master-0 kubenswrapper[13205]: I0319 09:23:34.715972 13205 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 09:23:34.716030 master-0 kubenswrapper[13205]: I0319 09:23:34.716022 13205 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 09:23:34.716095 master-0 kubenswrapper[13205]: I0319 09:23:34.716084 13205 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 09:23:34.716147 master-0 kubenswrapper[13205]: I0319 09:23:34.716138 13205 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 09:23:34.716197 master-0 kubenswrapper[13205]: I0319 09:23:34.716188 13205 flags.go:64] FLAG: --enable-server="true"
Mar 19 09:23:34.716268 master-0 kubenswrapper[13205]: I0319 09:23:34.716258 13205 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 09:23:34.716328 master-0 kubenswrapper[13205]: I0319 09:23:34.716319 13205 flags.go:64] FLAG: --event-burst="100"
Mar 19 09:23:34.716384 master-0 kubenswrapper[13205]: I0319 09:23:34.716375 13205 flags.go:64] FLAG: --event-qps="50"
Mar 19 09:23:34.716431 master-0 kubenswrapper[13205]: I0319 09:23:34.716423 13205 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 09:23:34.716486 master-0 kubenswrapper[13205]: I0319 09:23:34.716478 13205 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 09:23:34.716560 master-0 kubenswrapper[13205]: I0319 09:23:34.716531 13205 flags.go:64] FLAG: --eviction-hard=""
Mar 19 09:23:34.716654 master-0 kubenswrapper[13205]: I0319 09:23:34.716639 13205 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 09:23:34.716711 master-0 kubenswrapper[13205]: I0319 09:23:34.716702 13205 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 09:23:34.716764 master-0 kubenswrapper[13205]: I0319 09:23:34.716756 13205 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 09:23:34.716814 master-0 kubenswrapper[13205]: I0319 09:23:34.716806 13205 flags.go:64] FLAG: --eviction-soft=""
Mar 19 09:23:34.716874 master-0 kubenswrapper[13205]: I0319 09:23:34.716864 13205 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 09:23:34.716922 master-0 kubenswrapper[13205]: I0319 09:23:34.716913 13205 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 09:23:34.716971 master-0 kubenswrapper[13205]: I0319 09:23:34.716963 13205 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 09:23:34.717020 master-0 kubenswrapper[13205]: I0319 09:23:34.717012 13205 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 09:23:34.717070 master-0 kubenswrapper[13205]: I0319 09:23:34.717061 13205 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 09:23:34.717119 master-0 kubenswrapper[13205]: I0319 09:23:34.717111 13205 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 09:23:34.717171 master-0 kubenswrapper[13205]: I0319 09:23:34.717161 13205 flags.go:64] FLAG: --feature-gates=""
Mar 19 09:23:34.717227 master-0 kubenswrapper[13205]: I0319 09:23:34.717218 13205 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 09:23:34.717279 master-0 kubenswrapper[13205]: I0319 09:23:34.717271 13205 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 09:23:34.717326 master-0 kubenswrapper[13205]: I0319 09:23:34.717318 13205 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 09:23:34.717371 master-0 kubenswrapper[13205]: I0319 09:23:34.717363 13205 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 09:23:34.717420 master-0 kubenswrapper[13205]: I0319 09:23:34.717411 13205 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 09:23:34.717468 master-0 kubenswrapper[13205]: I0319 09:23:34.717460 13205 flags.go:64] FLAG: --help="false"
Mar 19 09:23:34.717524 master-0 kubenswrapper[13205]: I0319 09:23:34.717512 13205 flags.go:64] FLAG: --hostname-override=""
Mar 19 09:23:34.717604 master-0 kubenswrapper[13205]: I0319 09:23:34.717594 13205 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 09:23:34.717652 master-0 kubenswrapper[13205]: I0319 09:23:34.717643 13205 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 09:23:34.717712 master-0 kubenswrapper[13205]: I0319 09:23:34.717702 13205 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 09:23:34.717774 master-0 kubenswrapper[13205]: I0319 09:23:34.717764 13205 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 09:23:34.717828 master-0 kubenswrapper[13205]: I0319 09:23:34.717819 13205 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 09:23:34.717876 master-0 kubenswrapper[13205]: I0319 09:23:34.717868 13205 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 09:23:34.717924 master-0 kubenswrapper[13205]: I0319 09:23:34.717916 13205 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 09:23:34.717980 master-0 kubenswrapper[13205]: I0319 09:23:34.717972 13205 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 09:23:34.718027 master-0 kubenswrapper[13205]: I0319 09:23:34.718019 13205 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 09:23:34.718073 master-0 kubenswrapper[13205]: I0319 09:23:34.718064 13205 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 09:23:34.718120 master-0 kubenswrapper[13205]: I0319 09:23:34.718113 13205 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 09:23:34.718168 master-0 kubenswrapper[13205]: I0319 09:23:34.718160 13205 flags.go:64] FLAG: --kube-reserved=""
Mar 19 09:23:34.718220 master-0 kubenswrapper[13205]: I0319 09:23:34.718212 13205 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 09:23:34.718272 master-0 kubenswrapper[13205]: I0319 09:23:34.718264 13205 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 09:23:34.718340 master-0 kubenswrapper[13205]: I0319 09:23:34.718324 13205 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 09:23:34.718414 master-0 kubenswrapper[13205]: I0319 09:23:34.718402 13205 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 09:23:34.718482 master-0 kubenswrapper[13205]: I0319 09:23:34.718472 13205 flags.go:64] FLAG: --lock-file=""
Mar 19 09:23:34.718552 master-0 kubenswrapper[13205]: I0319 09:23:34.718529 13205 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 09:23:34.718618 master-0 kubenswrapper[13205]: I0319 09:23:34.718609 13205 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 09:23:34.718678 master-0 kubenswrapper[13205]: I0319 09:23:34.718664 13205 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 09:23:34.718745 master-0 kubenswrapper[13205]: I0319 09:23:34.718734 13205 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 09:23:34.718798 master-0 kubenswrapper[13205]: I0319 09:23:34.718790 13205 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 09:23:34.718848 master-0 kubenswrapper[13205]: I0319 09:23:34.718839 13205 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 09:23:34.718895 master-0 kubenswrapper[13205]: I0319 09:23:34.718887 13205 flags.go:64] FLAG: --logging-format="text"
Mar 19 09:23:34.718956 master-0 kubenswrapper[13205]: I0319 09:23:34.718946 13205 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 09:23:34.719021 master-0 kubenswrapper[13205]: I0319 09:23:34.719011 13205 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 09:23:34.719073 master-0 kubenswrapper[13205]: I0319 09:23:34.719065 13205 flags.go:64] FLAG: --manifest-url=""
Mar 19 09:23:34.719124 master-0 kubenswrapper[13205]: I0319 09:23:34.719114 13205 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 09:23:34.719177 master-0 kubenswrapper[13205]: I0319 09:23:34.719168 13205 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 09:23:34.719227 master-0 kubenswrapper[13205]: I0319 09:23:34.719217 13205 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 09:23:34.719276 master-0 kubenswrapper[13205]: I0319 09:23:34.719268 13205 flags.go:64] FLAG: --max-pods="110"
Mar 19 09:23:34.719320 master-0 kubenswrapper[13205]: I0319 09:23:34.719312 13205 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 09:23:34.719364 master-0 kubenswrapper[13205]: I0319 09:23:34.719356 13205 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 09:23:34.719413 master-0 kubenswrapper[13205]: I0319 09:23:34.719405 13205 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 09:23:34.719457 master-0 kubenswrapper[13205]: I0319 09:23:34.719449 13205 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 09:23:34.719526 master-0 kubenswrapper[13205]: I0319 09:23:34.719510 13205 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 09:23:34.719629 master-0 kubenswrapper[13205]: I0319 09:23:34.719613 13205 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 09:23:34.719712 master-0 kubenswrapper[13205]: I0319 09:23:34.719689 13205 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 09:23:34.719765 master-0 kubenswrapper[13205]: I0319 09:23:34.719757 13205 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 09:23:34.719814 master-0 kubenswrapper[13205]: I0319 09:23:34.719806 13205 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 09:23:34.719867 master-0 kubenswrapper[13205]: I0319 09:23:34.719856 13205 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 09:23:34.719929 master-0 kubenswrapper[13205]: I0319 09:23:34.719918 13205 flags.go:64] FLAG: --pod-cidr=""
Mar 19 09:23:34.719999 master-0 kubenswrapper[13205]: I0319 09:23:34.719985 13205 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 09:23:34.720051 master-0 kubenswrapper[13205]: I0319 09:23:34.720043 13205 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 09:23:34.720109 master-0 kubenswrapper[13205]: I0319 09:23:34.720098 13205 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 09:23:34.720176 master-0 kubenswrapper[13205]: I0319 09:23:34.720165 13205 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 09:23:34.720243 master-0 kubenswrapper[13205]: I0319 09:23:34.720232 13205 flags.go:64] FLAG: --port="10250"
Mar 19 09:23:34.720328 master-0 kubenswrapper[13205]: I0319 09:23:34.720316 13205 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 09:23:34.720404 master-0 kubenswrapper[13205]: I0319 09:23:34.720392 13205 flags.go:64] FLAG: --provider-id=""
Mar 19 09:23:34.720476 master-0 kubenswrapper[13205]: I0319 09:23:34.720465 13205 flags.go:64] FLAG: --qos-reserved=""
Mar 19 09:23:34.720558 master-0 kubenswrapper[13205]: I0319 09:23:34.720531 13205 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 09:23:34.720631 master-0 kubenswrapper[13205]: I0319 09:23:34.720621 13205 flags.go:64] FLAG: --register-node="true"
Mar 19 09:23:34.720683 master-0 kubenswrapper[13205]: I0319 09:23:34.720675 13205 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 09:23:34.720737 master-0 kubenswrapper[13205]: I0319 09:23:34.720724 13205 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 09:23:34.720790 master-0 kubenswrapper[13205]: I0319 09:23:34.720781 13205 flags.go:64] FLAG: --registry-burst="10"
Mar 19 09:23:34.720835 master-0 kubenswrapper[13205]: I0319 09:23:34.720827 13205 flags.go:64] FLAG: --registry-qps="5"
Mar 19 09:23:34.720888 master-0 kubenswrapper[13205]: I0319 09:23:34.720878 13205 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 09:23:34.720955 master-0 kubenswrapper[13205]: I0319 09:23:34.720942 13205 flags.go:64] FLAG: --reserved-memory=""
Mar 19 09:23:34.721008 master-0 kubenswrapper[13205]: I0319 09:23:34.720999 13205 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 09:23:34.721054 master-0 kubenswrapper[13205]: I0319 09:23:34.721046 13205 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 09:23:34.721103 master-0 kubenswrapper[13205]: I0319 09:23:34.721095 13205 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 09:23:34.721148 master-0 kubenswrapper[13205]: I0319 09:23:34.721140 13205 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 09:23:34.721235 master-0 kubenswrapper[13205]: I0319 09:23:34.721225 13205 flags.go:64] FLAG: --runonce="false"
Mar 19 09:23:34.721352 master-0 kubenswrapper[13205]: I0319 09:23:34.721342 13205 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 09:23:34.721404 master-0 kubenswrapper[13205]: I0319 09:23:34.721395 13205 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 09:23:34.721454 master-0 kubenswrapper[13205]: I0319 09:23:34.721446 13205 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 09:23:34.721502 master-0 kubenswrapper[13205]: I0319 09:23:34.721494 13205 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 09:23:34.721650 master-0 kubenswrapper[13205]: I0319 09:23:34.721633 13205 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 09:23:34.721711 master-0 kubenswrapper[13205]: I0319 09:23:34.721702 13205 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 09:23:34.721762 master-0 kubenswrapper[13205]: I0319 09:23:34.721753 13205 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 09:23:34.721814 master-0 kubenswrapper[13205]: I0319 09:23:34.721805 13205 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 09:23:34.721859 master-0 kubenswrapper[13205]: I0319 09:23:34.721851 13205 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 09:23:34.721907 master-0 kubenswrapper[13205]: I0319 09:23:34.721899 13205 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 09:23:34.721955 master-0 kubenswrapper[13205]: I0319 09:23:34.721947 13205 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 09:23:34.722000 master-0 kubenswrapper[13205]: I0319 09:23:34.721992 13205 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 09:23:34.722048 master-0 kubenswrapper[13205]: I0319 09:23:34.722040 13205 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 09:23:34.722096 master-0 kubenswrapper[13205]: I0319 09:23:34.722088 13205 flags.go:64] FLAG: --system-cgroups=""
Mar 19 09:23:34.722149 master-0 kubenswrapper[13205]: I0319 09:23:34.722137 13205 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 09:23:34.722201 master-0 kubenswrapper[13205]: I0319 09:23:34.722193 13205 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 09:23:34.722246 master-0 kubenswrapper[13205]: I0319 09:23:34.722239 13205 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 09:23:34.722298 master-0 kubenswrapper[13205]: I0319 09:23:34.722288 13205 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 09:23:34.722347 master-0 kubenswrapper[13205]: I0319 09:23:34.722339 13205 flags.go:64] FLAG: --tls-min-version=""
Mar 19 09:23:34.722392 master-0 kubenswrapper[13205]: I0319 09:23:34.722384 13205 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 09:23:34.722439 master-0 kubenswrapper[13205]: I0319 09:23:34.722431 13205 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 09:23:34.722488 master-0 kubenswrapper[13205]: I0319 09:23:34.722479 13205 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 09:23:34.722560 master-0 kubenswrapper[13205]: I0319 09:23:34.722550 13205 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 09:23:34.722610 master-0 kubenswrapper[13205]: I0319 09:23:34.722600 13205 flags.go:64] FLAG: --v="2"
Mar 19 09:23:34.722670 master-0 kubenswrapper[13205]: I0319 09:23:34.722659 13205 flags.go:64] FLAG: --version="false"
Mar 19 09:23:34.722718 master-0 kubenswrapper[13205]: I0319 09:23:34.722708 13205 flags.go:64] FLAG: --vmodule=""
Mar 19 09:23:34.722764 master-0 kubenswrapper[13205]: I0319 09:23:34.722755 13205 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 09:23:34.722814 master-0 kubenswrapper[13205]: I0319 09:23:34.722805 13205 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 09:23:34.723060 master-0 kubenswrapper[13205]: W0319 09:23:34.723049 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:23:34.723122 master-0 kubenswrapper[13205]: W0319 09:23:34.723114 13205 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:23:34.723171 master-0 kubenswrapper[13205]: W0319 09:23:34.723164 13205 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:23:34.723217 master-0 kubenswrapper[13205]: W0319 09:23:34.723210 13205 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:23:34.723277 master-0 kubenswrapper[13205]: W0319 09:23:34.723267 13205 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:23:34.723348 master-0 kubenswrapper[13205]: W0319 09:23:34.723339 13205 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:23:34.723400 master-0 kubenswrapper[13205]: W0319 09:23:34.723392 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:23:34.723455 master-0 kubenswrapper[13205]: W0319 09:23:34.723445 13205 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:23:34.723512 master-0 kubenswrapper[13205]: W0319 09:23:34.723504 13205 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:23:34.723586 master-0 kubenswrapper[13205]: W0319 09:23:34.723577 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:23:34.723649 master-0 kubenswrapper[13205]: W0319 09:23:34.723641 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:23:34.723700 master-0 kubenswrapper[13205]: W0319 09:23:34.723692 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:23:34.723745 master-0 kubenswrapper[13205]: W0319 09:23:34.723737 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:23:34.723788 master-0 kubenswrapper[13205]: W0319 09:23:34.723781 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:23:34.723836 master-0 kubenswrapper[13205]: W0319 09:23:34.723829 13205 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:23:34.723891 master-0 kubenswrapper[13205]: W0319 09:23:34.723883 13205 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:23:34.723936 master-0 kubenswrapper[13205]: W0319 09:23:34.723928 13205 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:23:34.724029 master-0 kubenswrapper[13205]: W0319 09:23:34.724020 13205 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:23:34.724074 master-0 kubenswrapper[13205]: W0319 09:23:34.724067 13205 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:23:34.724117 master-0 kubenswrapper[13205]: W0319 09:23:34.724110 13205 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:23:34.724172 master-0 kubenswrapper[13205]: W0319 09:23:34.724164 13205 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:23:34.724217 master-0 kubenswrapper[13205]: W0319 09:23:34.724209 13205 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:23:34.724270 master-0 kubenswrapper[13205]: W0319 09:23:34.724261 13205 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:23:34.724356 master-0 kubenswrapper[13205]: W0319 09:23:34.724345 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:23:34.724427 master-0 kubenswrapper[13205]: W0319 09:23:34.724417 13205 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:23:34.724487 master-0 kubenswrapper[13205]: W0319 09:23:34.724479 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:23:34.724566 master-0 kubenswrapper[13205]: W0319 09:23:34.724556 13205 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:23:34.724631 master-0 kubenswrapper[13205]: W0319 09:23:34.724622 13205 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:23:34.724683 master-0 kubenswrapper[13205]: W0319 09:23:34.724675 13205 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724730 13205 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724738 13205 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724742 13205 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724747 13205 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724751 13205 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724755 13205 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724760 13205 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724765 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724769 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724773 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724777 13205 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724784 13205 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724789 13205 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724794 13205 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724799 13205 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724803 13205 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724808 13205 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724812 13205 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724816 13205 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:23:34.725091 master-0 kubenswrapper[13205]: W0319 09:23:34.724821 13205 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724826 13205 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724831 13205 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724836 13205 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724840 13205 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724845 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724849 13205 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724855 13205 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724861 13205 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724866 13205 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724871 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724876 13205 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724880 13205 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724887 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724893 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724897 13205 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724902 13205 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724906 13205 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724913 13205 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true.
It will be removed in a future release. Mar 19 09:23:34.725801 master-0 kubenswrapper[13205]: W0319 09:23:34.724918 13205 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:23:34.726348 master-0 kubenswrapper[13205]: W0319 09:23:34.724923 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:23:34.726348 master-0 kubenswrapper[13205]: W0319 09:23:34.724928 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:23:34.726348 master-0 kubenswrapper[13205]: W0319 09:23:34.724933 13205 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:23:34.726348 master-0 kubenswrapper[13205]: W0319 09:23:34.724937 13205 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:23:34.726348 master-0 kubenswrapper[13205]: I0319 09:23:34.724944 13205 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:23:34.734059 master-0 kubenswrapper[13205]: I0319 09:23:34.732689 13205 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 19 09:23:34.734059 master-0 kubenswrapper[13205]: I0319 09:23:34.733268 13205 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734114 13205 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:23:34.734227 master-0 
kubenswrapper[13205]: W0319 09:23:34.734125 13205 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734129 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734133 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734137 13205 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734140 13205 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734144 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734148 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734152 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734155 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734159 13205 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734163 13205 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734167 13205 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734170 13205 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734174 13205 
feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734177 13205 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734181 13205 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734185 13205 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734189 13205 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:23:34.734227 master-0 kubenswrapper[13205]: W0319 09:23:34.734192 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734196 13205 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734199 13205 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734203 13205 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734207 13205 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734210 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734214 13205 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734218 13205 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734224 13205 feature_gate.go:353] Setting GA 
feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734231 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734236 13205 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734242 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734248 13205 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734253 13205 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734258 13205 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734262 13205 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734267 13205 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734270 13205 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734274 13205 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:23:34.734895 master-0 kubenswrapper[13205]: W0319 09:23:34.734278 13205 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734283 13205 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734289 13205 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734293 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734297 13205 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734360 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734368 13205 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734373 13205 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734377 13205 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734382 13205 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734388 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734393 13205 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734397 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734405 13205 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734410 13205 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734416 13205 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734421 13205 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734425 13205 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734430 13205 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:23:34.735796 master-0 kubenswrapper[13205]: W0319 09:23:34.734435 13205 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734440 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734446 13205 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734451 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734456 13205 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734461 13205 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734465 13205 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734470 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734475 13205 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734533 13205 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734541 13205 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734546 13205 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734551 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734555 13205 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: W0319 09:23:34.734560 13205 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:23:34.736315 master-0 kubenswrapper[13205]: I0319 09:23:34.734568 13205 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true 
DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734740 13205 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734757 13205 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734764 13205 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734769 13205 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734774 13205 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734779 13205 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734785 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734789 13205 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734794 13205 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734799 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 
09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734804 13205 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734809 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734815 13205 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734819 13205 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734824 13205 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734828 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734833 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734837 13205 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734842 13205 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:23:34.736831 master-0 kubenswrapper[13205]: W0319 09:23:34.734847 13205 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734851 13205 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734856 13205 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734861 13205 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather 
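A hedged aside for readers scanning these entries: each pass over the configuration ends with an I-level feature_gate.go:386 line that prints the effective gates as a Go map literal (the `feature gates: {map[...]}` entries above). The snippet below is a minimal sketch of turning one such entry into a Python dict; the helper name `parse_feature_gates` is invented here, and it assumes the map literal prints on a single line.

```python
import re

def parse_feature_gates(entry: str) -> dict:
    """Parse the Go map literal from a feature_gate.go:386 journal entry.
    Hypothetical helper; assumes the whole map fits in one entry."""
    m = re.search(r"feature gates: \{map\[([^\]]*)\]\}", entry)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        # Each pair looks like "KMSv1:true" or "NodeSwap:false".
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

entry = ('I0319 09:23:34.724944 13205 feature_gate.go:386] feature gates: '
         '{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false '
         'ValidatingAdmissionPolicy:true]}')
print(parse_feature_gates(entry))
```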
Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734865 13205 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734901 13205 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734906 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734910 13205 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734914 13205 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734917 13205 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734921 13205 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734925 13205 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734929 13205 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734933 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734938 13205 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734943 13205 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734947 13205 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734951 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734955 13205 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734959 13205 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:23:34.737367 master-0 kubenswrapper[13205]: W0319 09:23:34.734963 13205 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.734966 13205 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.734970 13205 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.734975 13205 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.734979 13205 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.734984 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.734989 13205 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.734993 13205 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:23:34.737916 
master-0 kubenswrapper[13205]: W0319 09:23:34.734998 13205 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735003 13205 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735008 13205 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735013 13205 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735017 13205 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735022 13205 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735027 13205 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735032 13205 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735037 13205 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735041 13205 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735046 13205 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735056 13205 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:23:34.737916 master-0 kubenswrapper[13205]: W0319 09:23:34.735063 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 
09:23:34.735067 13205 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735072 13205 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735078 13205 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735085 13205 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735090 13205 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735094 13205 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735104 13205 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735113 13205 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
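The bulk of this startup log is feature_gate.go:330 warnings, repeated once per parse of the gate set; the names (NewOLM, OpenShiftPodSecurityAdmission, and so on) appear to be cluster-level gates that the kubelet's own gate registry does not know, so it warns and ignores them. As a hedged triage sketch, the function below (the name `unrecognized_gates` is invented here) tallies how often each gate name is warned about in a journal excerpt:

```python
import re
from collections import Counter

def unrecognized_gates(journal_text: str) -> Counter:
    """Tally 'unrecognized feature gate: <Name>' warnings per gate name."""
    return Counter(re.findall(r"unrecognized feature gate: (\S+)", journal_text))

sample = """\
W0319 09:23:34.723928 13205 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
W0319 09:23:34.724020 13205 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
W0319 09:23:34.734167 13205 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
"""
print(unrecognized_gates(sample).most_common())
```

A count greater than one per gate, as here, just means the kubelet parsed the same gate set more than once during startup, not that the configuration lists duplicates.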
Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735120 13205 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735125 13205 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735130 13205 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: W0319 09:23:34.735135 13205 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: I0319 09:23:34.735143 13205 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:23:34.738467 master-0 kubenswrapper[13205]: I0319 09:23:34.735358 13205 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 09:23:34.739036 master-0 kubenswrapper[13205]: I0319 09:23:34.737354 13205 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 19 09:23:34.739036 master-0 kubenswrapper[13205]: I0319 09:23:34.737596 13205 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 19 09:23:34.739036 master-0 kubenswrapper[13205]: I0319 09:23:34.738132 13205 server.go:997] "Starting client certificate rotation"
Mar 19 09:23:34.739036 master-0 kubenswrapper[13205]: I0319 09:23:34.738228 13205 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 09:23:34.739036 master-0 kubenswrapper[13205]: I0319 09:23:34.738414 13205 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 09:08:48 +0000 UTC, rotation deadline is 2026-03-20 03:17:11.527090972 +0000 UTC
Mar 19 09:23:34.739036 master-0 kubenswrapper[13205]: I0319 09:23:34.738476 13205 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h53m36.788618417s for next certificate rotation
Mar 19 09:23:34.739210 master-0 kubenswrapper[13205]: I0319 09:23:34.739162 13205 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:23:34.740725 master-0 kubenswrapper[13205]: I0319 09:23:34.740696 13205 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:23:34.743901 master-0 kubenswrapper[13205]: I0319 09:23:34.743864 13205 log.go:25] "Validated CRI v1 runtime API"
Mar 19 09:23:34.747044 master-0 kubenswrapper[13205]: I0319 09:23:34.747017 13205 log.go:25] "Validated CRI v1 image API"
Mar 19 09:23:34.747799 master-0 kubenswrapper[13205]: I0319 09:23:34.747771 13205 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 09:23:34.754811 master-0 kubenswrapper[13205]: I0319 09:23:34.754699 13205 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 aae93335-158a-444f-870b-34679824b626:/dev/vda3]
Mar 19 09:23:34.755847 master-0 kubenswrapper[13205]: I0319 09:23:34.754752 13205 fs.go:136] Filesystem partitions: 
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229/userdata/shm major:0 minor:249 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2/userdata/shm major:0 minor:73 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1885558dee49f6f6ad4666eff4afb57c213620724cc5285f30bbd5409ae9582e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1885558dee49f6f6ad4666eff4afb57c213620724cc5285f30bbd5409ae9582e/userdata/shm major:0 minor:253 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/202f921c55b75970ad21b44a2165cf9cf2366346959189388624b5cff168cafb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/202f921c55b75970ad21b44a2165cf9cf2366346959189388624b5cff168cafb/userdata/shm major:0 minor:548 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2451d7e3dd79303504d5964f5bc9fe498e3fce32e9bf236a0e1ab73d89c4fa39/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2451d7e3dd79303504d5964f5bc9fe498e3fce32e9bf236a0e1ab73d89c4fa39/userdata/shm major:0 minor:556 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2667f8abc3377f4e949ca5efee8caf3d44c08b3911b024266dc76fb9003cb2e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2667f8abc3377f4e949ca5efee8caf3d44c08b3911b024266dc76fb9003cb2e0/userdata/shm major:0 minor:442 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2702ffe2a7096514b0cf147d61b08f45ac487590697d47b826f39e03c4994a7d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2702ffe2a7096514b0cf147d61b08f45ac487590697d47b826f39e03c4994a7d/userdata/shm major:0 minor:366 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2c994266a2d02f0511c56e203e02c66ec993c8a4956cebe37152ed3179a4c4ff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2c994266a2d02f0511c56e203e02c66ec993c8a4956cebe37152ed3179a4c4ff/userdata/shm major:0 minor:558 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/2e855eb091c3c19e500b45068cb0c0c879f09feb5f816f86f9d1253a9f1c5dcd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2e855eb091c3c19e500b45068cb0c0c879f09feb5f816f86f9d1253a9f1c5dcd/userdata/shm major:0 minor:373 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/310348963a49f41d34871a4c0d732a2191aaea2d2db0ebbe19d1390098835ced/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/310348963a49f41d34871a4c0d732a2191aaea2d2db0ebbe19d1390098835ced/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3d5b3f08e9980af7a4eb46a62a5af4211db365f64342fe1705f26fa41b7b1331/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3d5b3f08e9980af7a4eb46a62a5af4211db365f64342fe1705f26fa41b7b1331/userdata/shm major:0 minor:551 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3e14d5393be022eff24be7e8d5e671dc610671f728796a2cb5a2309e1895b5f0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3e14d5393be022eff24be7e8d5e671dc610671f728796a2cb5a2309e1895b5f0/userdata/shm major:0 minor:476 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda/userdata/shm major:0 minor:683 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/44d9515c76d2b5369510d3737c2fe1814c5099a9199ebffb839eb4e657e0735e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/44d9515c76d2b5369510d3737c2fe1814c5099a9199ebffb839eb4e657e0735e/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/4e6b0fbcf10efb9d378e7013c9ee95c7eea5f13187283f4e3dcc1192d68f1166/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4e6b0fbcf10efb9d378e7013c9ee95c7eea5f13187283f4e3dcc1192d68f1166/userdata/shm major:0 minor:552 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5001c0304645acfa799077998786ddfe7d90e702ba8e83ddc5ed0850af9bd30d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5001c0304645acfa799077998786ddfe7d90e702ba8e83ddc5ed0850af9bd30d/userdata/shm major:0 minor:557 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/50cc5384c1b8bb903ef5215671baf6cb4c6d2ce7a00d389208992a278c3b103c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/50cc5384c1b8bb903ef5215671baf6cb4c6d2ce7a00d389208992a278c3b103c/userdata/shm major:0 minor:447 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/58a6496fefda9dc10f2cd3d711f675ec3a41cf0c8719af9244e86cc4f0694683/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/58a6496fefda9dc10f2cd3d711f675ec3a41cf0c8719af9244e86cc4f0694683/userdata/shm major:0 minor:247 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/68534953ce4f9d64b4ca25577e4617ff34537dd8175ec1c79125e169063bd6f3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/68534953ce4f9d64b4ca25577e4617ff34537dd8175ec1c79125e169063bd6f3/userdata/shm major:0 minor:547 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54/userdata/shm major:0 minor:685 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/72c2a8691354c4c557c4f66cfa7db93075f810bfaffc8fba5e2d6aab857f58a8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/72c2a8691354c4c557c4f66cfa7db93075f810bfaffc8fba5e2d6aab857f58a8/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/76a2be65b345aaa03d42847ddf4106be40d256a72f66630810b64aeb72f9c081/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/76a2be65b345aaa03d42847ddf4106be40d256a72f66630810b64aeb72f9c081/userdata/shm major:0 minor:52 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5/userdata/shm major:0 minor:644 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/813a77628cdb690ef9ed760c21cb05d1f17fab6329f59eb55493fe5e4d55f0d3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/813a77628cdb690ef9ed760c21cb05d1f17fab6329f59eb55493fe5e4d55f0d3/userdata/shm major:0 minor:75 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/82537ef54e6382f32d5fe80aff3875a880fc715c44848fde8e9d22a20125f223/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/82537ef54e6382f32d5fe80aff3875a880fc715c44848fde8e9d22a20125f223/userdata/shm major:0 minor:499 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/8bbb7eb717a10731a76fbab7e75a4760990dac18f169f5c55d4ff290082a576b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8bbb7eb717a10731a76fbab7e75a4760990dac18f169f5c55d4ff290082a576b/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1/userdata/shm major:0 minor:105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/915745b7520cddc649a8755f21f9aee4e835c7048a1ccbf6eea3461a98982a5e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/915745b7520cddc649a8755f21f9aee4e835c7048a1ccbf6eea3461a98982a5e/userdata/shm major:0 minor:91 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/963f71e764d046880085fa5f09ddf4d6f88636354e79d8ab2e64d52ec74b74ae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/963f71e764d046880085fa5f09ddf4d6f88636354e79d8ab2e64d52ec74b74ae/userdata/shm major:0 minor:257 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174/userdata/shm major:0 minor:109 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9c83b15dba69144ea146a0efbf84c34a34a7bbb646a98c775e5d8f6252c9784a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c83b15dba69144ea146a0efbf84c34a34a7bbb646a98c775e5d8f6252c9784a/userdata/shm major:0 minor:459 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9e6502d00c4d560279a6b84e0eac2864639061d852a900f00e6d52ff81453134/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9e6502d00c4d560279a6b84e0eac2864639061d852a900f00e6d52ff81453134/userdata/shm major:0 minor:559 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7/userdata/shm major:0 minor:56 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c09069a2109d448b73c63e5e3d2a41051b8198531ad6e6a692843369313b17a8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c09069a2109d448b73c63e5e3d2a41051b8198531ad6e6a692843369313b17a8/userdata/shm major:0 minor:727 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c604e07b23c824fe44edd155fd3bcc4d87de07b9af516a6fc04d64e9a7ef4a11/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c604e07b23c824fe44edd155fd3bcc4d87de07b9af516a6fc04d64e9a7ef4a11/userdata/shm major:0 minor:554 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d146d705aef07c62cae684b04e48b6f5db2109ae5a19f2d427637f8db5f61221/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d146d705aef07c62cae684b04e48b6f5db2109ae5a19f2d427637f8db5f61221/userdata/shm major:0 minor:421 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d6cda39585354e47346ec04d7e9023161d8c669dfe02492069483d076fdb9801/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d6cda39585354e47346ec04d7e9023161d8c669dfe02492069483d076fdb9801/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9034c9252b2dbe49fa20bf241af605c2b9efd4ec2d903f7338b331b9a335a60/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9034c9252b2dbe49fa20bf241af605c2b9efd4ec2d903f7338b331b9a335a60/userdata/shm major:0 minor:550 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/db26592d73a7736c3517fdc2847e1043ee77738e2bbfdfcabaf0fa7701a43b04/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/db26592d73a7736c3517fdc2847e1043ee77738e2bbfdfcabaf0fa7701a43b04/userdata/shm major:0 minor:379 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3/userdata/shm major:0 minor:255 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ecba3762dd2d103496ed9fed52be51c550935d62b9dab4b76da7f92f8e0395b8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ecba3762dd2d103496ed9fed52be51c550935d62b9dab4b76da7f92f8e0395b8/userdata/shm major:0 minor:318 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f1b224e2c3a94acf439ad26c1933da88f3f4e0a2666ac3998bdb5a26f2159e15/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f1b224e2c3a94acf439ad26c1933da88f3f4e0a2666ac3998bdb5a26f2159e15/userdata/shm major:0 minor:460 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e/userdata/shm major:0 minor:669 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:243 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/kube-api-access-6bdnt:{mountpoint:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/kube-api-access-6bdnt major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~secret/metrics-tls major:0 minor:507 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0bce9154-cd31-4c4a-9d86-2903d5b1adad/volumes/kubernetes.io~projected/kube-api-access-4kr8w:{mountpoint:/var/lib/kubelet/pods/0bce9154-cd31-4c4a-9d86-2903d5b1adad/volumes/kubernetes.io~projected/kube-api-access-4kr8w major:0 minor:55 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/10c609bb-136a-4ce2-b9e2-0a03e1a37a62/volumes/kubernetes.io~projected/kube-api-access-tpgbq:{mountpoint:/var/lib/kubelet/pods/10c609bb-136a-4ce2-b9e2-0a03e1a37a62/volumes/kubernetes.io~projected/kube-api-access-tpgbq major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~projected/kube-api-access-clpb5:{mountpoint:/var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~projected/kube-api-access-clpb5 major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~secret/metrics-certs major:0 minor:546 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/157e3524-eb27-41ca-b49d-2697ee1245ca/volumes/kubernetes.io~projected/kube-api-access-qhzsr:{mountpoint:/var/lib/kubelet/pods/157e3524-eb27-41ca-b49d-2697ee1245ca/volumes/kubernetes.io~projected/kube-api-access-qhzsr major:0 minor:103 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~projected/kube-api-access-9blbc:{mountpoint:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~projected/kube-api-access-9blbc major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~projected/kube-api-access-h5hk6:{mountpoint:/var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~projected/kube-api-access-h5hk6 major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~secret/metrics-tls major:0 minor:491 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1d5e311c-1c6a-4d5d-8c2b-493025593934/volumes/kubernetes.io~projected/kube-api-access-49fpz:{mountpoint:/var/lib/kubelet/pods/1d5e311c-1c6a-4d5d-8c2b-493025593934/volumes/kubernetes.io~projected/kube-api-access-49fpz major:0 minor:681 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1d5e311c-1c6a-4d5d-8c2b-493025593934/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1d5e311c-1c6a-4d5d-8c2b-493025593934/volumes/kubernetes.io~secret/serving-cert major:0 minor:675 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~projected/kube-api-access-6v88k:{mountpoint:/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~projected/kube-api-access-6v88k major:0 minor:232 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~secret/srv-cert major:0 minor:482 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~projected/ca-certs major:0 minor:409 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~projected/kube-api-access-kmpcn:{mountpoint:/var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~projected/kube-api-access-kmpcn major:0 minor:458 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:453 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~projected/kube-api-access-xvd6f:{mountpoint:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~projected/kube-api-access-xvd6f major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~projected/kube-api-access-6rqsq:{mountpoint:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~projected/kube-api-access-6rqsq major:0 minor:237 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~projected/kube-api-access-jtw68:{mountpoint:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~projected/kube-api-access-jtw68 major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~projected/kube-api-access-vf6dq:{mountpoint:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~projected/kube-api-access-vf6dq major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~projected/kube-api-access-2hnvh:{mountpoint:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~projected/kube-api-access-2hnvh major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4d2c5580-36f6-4107-af53-cfbd15080b30/volumes/kubernetes.io~projected/kube-api-access-x6j2m:{mountpoint:/var/lib/kubelet/pods/4d2c5580-36f6-4107-af53-cfbd15080b30/volumes/kubernetes.io~projected/kube-api-access-x6j2m major:0 minor:54 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4f65184f-8fc2-4656-8776-a3b962aa1f5d/volumes/kubernetes.io~projected/kube-api-access-j65pb:{mountpoint:/var/lib/kubelet/pods/4f65184f-8fc2-4656-8776-a3b962aa1f5d/volumes/kubernetes.io~projected/kube-api-access-j65pb major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~projected/kube-api-access major:0 minor:104 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~secret/serving-cert major:0 minor:543 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/56e11aac-d199-404a-a0e2-82c28926746d/volumes/kubernetes.io~projected/kube-api-access-pg4cn:{mountpoint:/var/lib/kubelet/pods/56e11aac-d199-404a-a0e2-82c28926746d/volumes/kubernetes.io~projected/kube-api-access-pg4cn major:0 minor:345 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~projected/kube-api-access-sfq74:{mountpoint:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~projected/kube-api-access-sfq74 major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~secret/webhook-cert major:0 minor:139 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~projected/kube-api-access-g7ppn:{mountpoint:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~projected/kube-api-access-g7ppn major:0 minor:415 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/encryption-config major:0 minor:413 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/etcd-client major:0 minor:414 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/serving-cert major:0 minor:437 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~projected/kube-api-access-cf5jl:{mountpoint:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~projected/kube-api-access-cf5jl major:0 minor:407 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/encryption-config major:0 minor:405 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/etcd-client major:0 minor:406 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/serving-cert major:0 minor:446 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/67d66357-fcee-4e70-b563-5895b978ab55/volumes/kubernetes.io~projected/kube-api-access-sclqq:{mountpoint:/var/lib/kubelet/pods/67d66357-fcee-4e70-b563-5895b978ab55/volumes/kubernetes.io~projected/kube-api-access-sclqq major:0 minor:682 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/67d66357-fcee-4e70-b563-5895b978ab55/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/67d66357-fcee-4e70-b563-5895b978ab55/volumes/kubernetes.io~secret/serving-cert major:0 minor:680 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/741c9d25-7634-41c0-bfe4-b7a15de4b341/volumes/kubernetes.io~projected/kube-api-access-4w7jx:{mountpoint:/var/lib/kubelet/pods/741c9d25-7634-41c0-bfe4-b7a15de4b341/volumes/kubernetes.io~projected/kube-api-access-4w7jx major:0 minor:322 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~projected/kube-api-access-8hw6b:{mountpoint:/var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~projected/kube-api-access-8hw6b major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~secret/srv-cert major:0 minor:508 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~projected/kube-api-access-qp9jf:{mountpoint:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~projected/kube-api-access-qp9jf major:0 minor:233 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~projected/kube-api-access-tgtgw:{mountpoint:/var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~projected/kube-api-access-tgtgw major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~secret/webhook-certs major:0 minor:505 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~projected/kube-api-access-djxfs:{mountpoint:/var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~projected/kube-api-access-djxfs major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:545 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e073eb4-67f2-4de7-8848-50da73079dbc/volumes/kubernetes.io~projected/kube-api-access-9plst:{mountpoint:/var/lib/kubelet/pods/8e073eb4-67f2-4de7-8848-50da73079dbc/volumes/kubernetes.io~projected/kube-api-access-9plst major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~projected/kube-api-access-2vcf6:{mountpoint:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~projected/kube-api-access-2vcf6 major:0 minor:226 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:536 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:541 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/979d4d12-a560-4309-a1d3-cbebe853e8ea/volumes/kubernetes.io~projected/kube-api-access-rxjqg:{mountpoint:/var/lib/kubelet/pods/979d4d12-a560-4309-a1d3-cbebe853e8ea/volumes/kubernetes.io~projected/kube-api-access-rxjqg major:0 minor:117 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~projected/kube-api-access-m6tp5:{mountpoint:/var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~projected/kube-api-access-m6tp5 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:544 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~projected/kube-api-access-vl7t5:{mountpoint:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~projected/kube-api-access-vl7t5 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/etcd-client major:0 minor:223 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3dddb56-d180-4b8a-85bd-77c3888d8f71/volumes/kubernetes.io~projected/kube-api-access-nxbdq:{mountpoint:/var/lib/kubelet/pods/a3dddb56-d180-4b8a-85bd-77c3888d8f71/volumes/kubernetes.io~projected/kube-api-access-nxbdq major:0 minor:378 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3dddb56-d180-4b8a-85bd-77c3888d8f71/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/a3dddb56-d180-4b8a-85bd-77c3888d8f71/volumes/kubernetes.io~secret/signing-key major:0 minor:377 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2/volumes/kubernetes.io~projected/kube-api-access-jw2x6:{mountpoint:/var/lib/kubelet/pods/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2/volumes/kubernetes.io~projected/kube-api-access-jw2x6 major:0 minor:726 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bf5dde46-8a95-46a6-bee5-20d3a58f33ee/volumes/kubernetes.io~projected/kube-api-access-6hxq7:{mountpoint:/var/lib/kubelet/pods/bf5dde46-8a95-46a6-bee5-20d3a58f33ee/volumes/kubernetes.io~projected/kube-api-access-6hxq7 major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/kube-api-access-v4hqj:{mountpoint:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/kube-api-access-v4hqj major:0 minor:216 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:506 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c5966fa8-b9f0-42ee-a75b-20014782366d/volumes/kubernetes.io~projected/kube-api-access-v6sr8:{mountpoint:/var/lib/kubelet/pods/c5966fa8-b9f0-42ee-a75b-20014782366d/volumes/kubernetes.io~projected/kube-api-access-v6sr8 major:0 minor:87 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~projected/kube-api-access major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~projected/kube-api-access-lvnb9:{mountpoint:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~projected/kube-api-access-lvnb9 major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/dc65ec1f-b8fb-40d6-ac39-46b255a33221/volumes/kubernetes.io~projected/kube-api-access-ww85l:{mountpoint:/var/lib/kubelet/pods/dc65ec1f-b8fb-40d6-ac39-46b255a33221/volumes/kubernetes.io~projected/kube-api-access-ww85l major:0 minor:365 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~projected/kube-api-access-w5f5s:{mountpoint:/var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~projected/kube-api-access-w5f5s major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:542 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~projected/kube-api-access major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~projected/kube-api-access-jqwbw:{mountpoint:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~projected/kube-api-access-jqwbw major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~projected/kube-api-access major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~secret/serving-cert major:0 minor:143 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f585ebb1-6210-463b-af85-fb29e1e7dfa5/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/f585ebb1-6210-463b-af85-fb29e1e7dfa5/volumes/kubernetes.io~projected/ca-certs major:0 minor:348 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f585ebb1-6210-463b-af85-fb29e1e7dfa5/volumes/kubernetes.io~projected/kube-api-access-5g4rw:{mountpoint:/var/lib/kubelet/pods/f585ebb1-6210-463b-af85-fb29e1e7dfa5/volumes/kubernetes.io~projected/kube-api-access-5g4rw major:0 minor:372 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ff98fb1e-7a1f-4657-b085-743d6f2d28e2/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/ff98fb1e-7a1f-4657-b085-743d6f2d28e2/volumes/kubernetes.io~projected/kube-api-access major:0 minor:627 fsType:tmpfs blockSize:0} overlay_0-107:{mountpoint:/var/lib/containers/storage/overlay/49c77cf98e23bcc29d51f526c209e269d2e59f1002100c3a2c756f9d79a7dbd8/merged major:0 minor:107 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/ab2346c2a8ec1c0db2f0b818783d76a67e678ce7d37dd270feb7117e043ce902/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-113:{mountpoint:/var/lib/containers/storage/overlay/85735960416d8dae28d1c4c938859b7212052bdfe098545fedc746816ba45eba/merged major:0 minor:113 fsType:overlay blockSize:0} 
overlay_0-115:{mountpoint:/var/lib/containers/storage/overlay/d1644848953ed8d9a4344709b4c6f76b6e4cee8c55bffdc83f506f0844589758/merged major:0 minor:115 fsType:overlay blockSize:0} overlay_0-118:{mountpoint:/var/lib/containers/storage/overlay/4c951d3c746ba5ca34c0a9b732afddd55d0a70d9e62f4a2a2b2dcb3a2dd9f0cf/merged major:0 minor:118 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/fe923a01aa149db76793dc17045917d96873c5b54c995671d24b1e685c601c7b/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/dee62491a119033bb7d50d887f579ca0f06611088f66c382a8139db78fb342a0/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/0311f58b992f13129e1493ef6fb7251826fa67edfc89c277a3273a9f2d178cdd/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/6956d109f581ba0a4aa3e6aaec0cdf0d42d2f31858fc15087ad2cda594012f24/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/7998f4274a2238b57e8b2902714659188437a0dbc960443bab9161f4b65e8cc4/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/bc7f3569f50a9078712d7980f75c32e2da6a0b726696bdccc245957c1a1a632b/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/c14136521213d1eb6853566726554217f34f18787c93b6c20c72e5d75aa253f4/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/3b47c5bde3ec022e7d3889500e412e583a2393d578c856f6b86a67d8fd226ef9/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-164:{mountpoint:/var/lib/containers/storage/overlay/e40be4c8888cabd2adee5fd441c862a9d1bb3e1da68fe2e07690d50feafc68e0/merged major:0 minor:164 fsType:overlay blockSize:0} 
overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/8720e958ef15eb5868a28c074b946011c4fae1d049f4f1ba42a1e3e747870488/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/1c91a1d2f63e80990d6a06bb2c0c1b685bda22da0a356a587f803bf87fdcf9a6/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/2f5b274c557c6a951d5978f3b4c87ef9bcd7579a6ee21cbeb34e12d31edcb5a7/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/982312cd7bee1c13f5ba7515790d39a96ed2c1716815f03376d4b1509f945a68/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/48eb8d26c67ca2a4a7f8f50b51c271eddc022981b7eb60ab3392efc2d96388f1/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/5f850bd024508006765b26f8c7d520fb2bb7ad541b2c30275e097ac1314becbe/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/d3e9581fe7ca74c6e2f6f90231a678caafcca776618b5b4fe4a5edec235b56f2/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/051a95d5a1d5c8375cb1419b880419b56ba1dd1729b44dde3605d95ba1c76397/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-271:{mountpoint:/var/lib/containers/storage/overlay/28b65bee44b8e1c6a3b254bee47d55bd95dd6bbad49e9a53f951c1f4ad2f155d/merged major:0 minor:271 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/659a2fb543fdae93daed72c971cd6a80ecd382cbc5ef7b88056d8af40569d1e0/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-275:{mountpoint:/var/lib/containers/storage/overlay/ee3ba172291ca65b21d577bef0420cf4a07656f251ee8cb3594c143e1740acc3/merged major:0 minor:275 fsType:overlay blockSize:0} 
overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/aba52ee22245adab58a7cb8da8b39a24ae4399c0efc1b34e9910b24ebf42f61e/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/7199d228cf83653f28117985b0b02fdaa27b01fda041902b16e1a069520aef8f/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/36692e533376d710b6ce6d3e9bb4e3555eb67c330c62bd146b67759e013799a6/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/267a5f646eb6b002ddbe15d0b586ee553fb7df87d84ca3689deaa8ef389b5f8d/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/b80865d8109771457ffc35c51dfc77e7d92c2c523057e06b3daff4af56d5fabd/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-288:{mountpoint:/var/lib/containers/storage/overlay/ca66014f55fd2451cf30d22b7dab301f6f102ef3ab4e7b3d17a96ce77e3bd4f5/merged major:0 minor:288 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/768745a4a94d5f49e975592d1c011de5ceedf48cd4986660d6917ce8d533c5d6/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/7aa19a329c9e16d2fe6e87e1a54ba48f5e65bfe432c5a57b213a3383e7f4d376/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/a0dae084494ee1f35434a45169f3811c9c66f4b3e2afc9db24b79287438fe9e5/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/a8fee7047db8240d64bbf7d5ee6342c15214c8d307aed58d1789dfe023b523fb/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/0260cfa862ad7cfa15dcc66117414aba171e5f52cb4dd892f74739b1e86aa71e/merged major:0 minor:301 fsType:overlay blockSize:0} 
overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/0cfe2253c96abf508af9315468577b11fcc8d1ae43c432541d59bdf5ee711645/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-320:{mountpoint:/var/lib/containers/storage/overlay/c9b1c24eef6192e05538284960dd4b3b0c3f71260c558e8b0c9a4f91461d71f7/merged major:0 minor:320 fsType:overlay blockSize:0} overlay_0-323:{mountpoint:/var/lib/containers/storage/overlay/a41584b045e315cb361fdd03a2648e51ebeccfb2bdb24779785ad7f08708bf18/merged major:0 minor:323 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/787a06f7779d24e368f7d6a2cf7aeedbfeb222da8b48771e5bddd85931c5ea49/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-330:{mountpoint:/var/lib/containers/storage/overlay/1f33b0c4ad8c2f7ccc5f7162cfc9c5c0cc606516b0560f174163c5a27b0e2ca0/merged major:0 minor:330 fsType:overlay blockSize:0} overlay_0-332:{mountpoint:/var/lib/containers/storage/overlay/b147fbc72c26ebcd9c5faa0008acba975c08f15b570200452da35b18a4f0ef82/merged major:0 minor:332 fsType:overlay blockSize:0} overlay_0-334:{mountpoint:/var/lib/containers/storage/overlay/2a3ca95b948adb5ac81cfb339672f27bd0ded1edfa62c5513296e08dafea63b9/merged major:0 minor:334 fsType:overlay blockSize:0} overlay_0-336:{mountpoint:/var/lib/containers/storage/overlay/c3c4e77a839c85d06b895981bfb0b94aa13482f916a0d7b044664db3c2ca5c42/merged major:0 minor:336 fsType:overlay blockSize:0} overlay_0-337:{mountpoint:/var/lib/containers/storage/overlay/565ca8091227d0bfa01dd53ccdf99b937b9288a42269c4ebbb4913926216a610/merged major:0 minor:337 fsType:overlay blockSize:0} overlay_0-339:{mountpoint:/var/lib/containers/storage/overlay/214821d7beb035708cad419a3ce1e5d0756a07ab4152fb734b871900863d23c7/merged major:0 minor:339 fsType:overlay blockSize:0} overlay_0-342:{mountpoint:/var/lib/containers/storage/overlay/7b626fea710d3d94073eddd338e64338d472272f5fc5891d161e87e974eb2915/merged major:0 minor:342 fsType:overlay blockSize:0} 
overlay_0-351:{mountpoint:/var/lib/containers/storage/overlay/4497bc522a37be6ec83630587fa6608c9d2d4d2f947190c95b339bd50fcda42a/merged major:0 minor:351 fsType:overlay blockSize:0} overlay_0-357:{mountpoint:/var/lib/containers/storage/overlay/cd87ba7a68728c7bcb47cd591ab06909e9ef9468b051900c64a80ee758679331/merged major:0 minor:357 fsType:overlay blockSize:0} overlay_0-369:{mountpoint:/var/lib/containers/storage/overlay/f7c905d165568fd119a6f3b2e5f9fd2eea943b25e8780ea17a982ad7b18c8a1e/merged major:0 minor:369 fsType:overlay blockSize:0} overlay_0-375:{mountpoint:/var/lib/containers/storage/overlay/cc07a6e716c587829f2db1e6be250027b7d67cf482d848f119a8bd165f3abef6/merged major:0 minor:375 fsType:overlay blockSize:0} overlay_0-381:{mountpoint:/var/lib/containers/storage/overlay/1d0c3da20b01904d864ee93d5fae72f28f8942aa3b4dc94e1fca81804dacb236/merged major:0 minor:381 fsType:overlay blockSize:0} overlay_0-383:{mountpoint:/var/lib/containers/storage/overlay/4a51e442d87abc045dc9bb0857badd677c9c135e7f131866edf1471d6eb37674/merged major:0 minor:383 fsType:overlay blockSize:0} overlay_0-389:{mountpoint:/var/lib/containers/storage/overlay/f07b25a0e2df3afc2c99c7188a95d338bf0e4ba2cd9e8d72c4596449066e0bfb/merged major:0 minor:389 fsType:overlay blockSize:0} overlay_0-391:{mountpoint:/var/lib/containers/storage/overlay/ca810814d413042c505c74819327a95fa25ad317bb5e80e092e237bc2ab0effb/merged major:0 minor:391 fsType:overlay blockSize:0} overlay_0-398:{mountpoint:/var/lib/containers/storage/overlay/6dda947d3ada3d2661872ea04bd0d5f2c2d28e24675934b8b04a6ea4aaef6209/merged major:0 minor:398 fsType:overlay blockSize:0} overlay_0-399:{mountpoint:/var/lib/containers/storage/overlay/7d6f9e800b0223faac5d127de94542ced7c1dd2be4482bd2c9e90e5150728135/merged major:0 minor:399 fsType:overlay blockSize:0} overlay_0-408:{mountpoint:/var/lib/containers/storage/overlay/06d52797fec2a0b1c249bdad8c5cef822576098ae437ae2d72a276faf21831b5/merged major:0 
minor:408 fsType:overlay blockSize:0} overlay_0-416:{mountpoint:/var/lib/containers/storage/overlay/c0915e50de0cf7723aab0ff93cba9b064c9bc62e9bfcd6f48be9ebb46c05d1a9/merged major:0 minor:416 fsType:overlay blockSize:0} overlay_0-418:{mountpoint:/var/lib/containers/storage/overlay/65000d5c8352320b300dc297d9e83cd59cf5c9710f16e4fe10dd55806dc65baa/merged major:0 minor:418 fsType:overlay blockSize:0} overlay_0-420:{mountpoint:/var/lib/containers/storage/overlay/a9c2e3d69fa888c2128b04ff6ba66461cb8f7ecb89683378b7112aa44b08713c/merged major:0 minor:420 fsType:overlay blockSize:0} overlay_0-422:{mountpoint:/var/lib/containers/storage/overlay/11587c5cd3cec163e7338c481e06869de10aa875b36ae3b5faad3090a4eff798/merged major:0 minor:422 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/814650a8e5a66d48479431897983c3fdd5edb710dbca0978e207c78edc55b856/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-431:{mountpoint:/var/lib/containers/storage/overlay/683ed4361ca4b976929fa36655ad9b8ccf19495fb8db9b787ebab3e03701ccb5/merged major:0 minor:431 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/94a85b89bd7de75c8ed4ade46a649e50c4f11e7b7eba10f0247cf3b77b395b32/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-444:{mountpoint:/var/lib/containers/storage/overlay/cf66c17dacf109bf89a5afe7d33e602da2333fea0d359139e81620c29c6adb6f/merged major:0 minor:444 fsType:overlay blockSize:0} overlay_0-449:{mountpoint:/var/lib/containers/storage/overlay/4d0c06e8ba9039b159274b44334dc6433603714b43bd0d3942d88848d487d8a8/merged major:0 minor:449 fsType:overlay blockSize:0} overlay_0-451:{mountpoint:/var/lib/containers/storage/overlay/53b46b880626721f13b2fbf06204bd172d65a15726a25f27a4c4218dba1adbe5/merged major:0 minor:451 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/1b4e232d56de6ea4a3ee00bc47233a748fc763f4071998d66578051c1610d0e5/merged major:0 minor:46 fsType:overlay 
blockSize:0} overlay_0-463:{mountpoint:/var/lib/containers/storage/overlay/8e2f62854fd89546156d90abd2c886c7771a102a02c21ed54e57eee0db2a9e94/merged major:0 minor:463 fsType:overlay blockSize:0} overlay_0-464:{mountpoint:/var/lib/containers/storage/overlay/2e2fa58a4282d7f78805f69bd13e50c0b7b13369e6d019acb3e91e5be4743e2e/merged major:0 minor:464 fsType:overlay blockSize:0} overlay_0-465:{mountpoint:/var/lib/containers/storage/overlay/763dfe5014c6574f20a8371875bd7bf411981cfcca9bd67cd18cc4c89f290600/merged major:0 minor:465 fsType:overlay blockSize:0} overlay_0-467:{mountpoint:/var/lib/containers/storage/overlay/a9ec511c6d2d236427bd032e18a98df945c4e2b6de35b3db45bae2d13d927ef0/merged major:0 minor:467 fsType:overlay blockSize:0} overlay_0-469:{mountpoint:/var/lib/containers/storage/overlay/d57311ed3e088792d21bd92e201bf7d8b06ea6f006953d51eb01f4e23f305c6a/merged major:0 minor:469 fsType:overlay blockSize:0} overlay_0-471:{mountpoint:/var/lib/containers/storage/overlay/16be01094c7ee99b1bd9bd4c62cfefe9ac24380a223ca33c7e362205f0aa98cc/merged major:0 minor:471 fsType:overlay blockSize:0} overlay_0-473:{mountpoint:/var/lib/containers/storage/overlay/63443006c303ba0cf49696f923974595eb67294b3e7f2b76d7bc383e93494dfc/merged major:0 minor:473 fsType:overlay blockSize:0} overlay_0-478:{mountpoint:/var/lib/containers/storage/overlay/06b67817d68e08972d8f87f1c015a44f1102415f8274c9290ec19fc48eeb50c0/merged major:0 minor:478 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/f6b3edf92c395afa60665b3832b209031a8503ab440ce26483c06caf480342b8/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-502:{mountpoint:/var/lib/containers/storage/overlay/d9b45e81e1301344ecc044bc28174c3c38c633491d511629f4e5a528c506fa7e/merged major:0 minor:502 fsType:overlay blockSize:0} overlay_0-503:{mountpoint:/var/lib/containers/storage/overlay/fcadb4ffac01d4d00a13ee53111c03e49bc3ce8a427455bdf0d5e7f8a9371c4e/merged major:0 minor:503 fsType:overlay blockSize:0} 
overlay_0-514:{mountpoint:/var/lib/containers/storage/overlay/a9b1550830ef03fb81a2f15fefba0cb95eca6b942a8f0f49e2f763c766bd4a41/merged major:0 minor:514 fsType:overlay blockSize:0} overlay_0-519:{mountpoint:/var/lib/containers/storage/overlay/1537e4fa583edac4763c786865e95cccde25c35aa9aac37cdff1f9deeebb5d27/merged major:0 minor:519 fsType:overlay blockSize:0} overlay_0-528:{mountpoint:/var/lib/containers/storage/overlay/a2410f81b9f2ce0c19d817c1deaa720019d6a759851352263c591fc751768516/merged major:0 minor:528 fsType:overlay blockSize:0} overlay_0-53:{mountpoint:/var/lib/containers/storage/overlay/bf39149e37557c8641283a8132fc65036077dd921dad9cfbb853092ff5ebe6fc/merged major:0 minor:53 fsType:overlay blockSize:0} overlay_0-565:{mountpoint:/var/lib/containers/storage/overlay/f380601f08cca0c785a0dbe964d84ecc96cd7415d562e52fa62331f80ffbeb82/merged major:0 minor:565 fsType:overlay blockSize:0} overlay_0-568:{mountpoint:/var/lib/containers/storage/overlay/bddbd7403da4e42f5c1b43fa9045aecc6133cd0646d21dae09dc2bf0fd52249a/merged major:0 minor:568 fsType:overlay blockSize:0} overlay_0-570:{mountpoint:/var/lib/containers/storage/overlay/2d91ec597edd5b6860b2671fd8b56e711511dc08953785982d23c155a99e41ba/merged major:0 minor:570 fsType:overlay blockSize:0} overlay_0-572:{mountpoint:/var/lib/containers/storage/overlay/dc64977413072fc347bd08edeeefe091f055e1bbfc03941d7b0023fd5cdf47fb/merged major:0 minor:572 fsType:overlay blockSize:0} overlay_0-574:{mountpoint:/var/lib/containers/storage/overlay/3d2302bfc6d0f12e4ac863adf49d17fb24dcf8cabbe65f64e78d3fbfb92c4787/merged major:0 minor:574 fsType:overlay blockSize:0} overlay_0-576:{mountpoint:/var/lib/containers/storage/overlay/d0a24514fc17bb969d70319d1c7205240cfb6ddcba178df4421f31cd25ee48b7/merged major:0 minor:576 fsType:overlay blockSize:0} overlay_0-578:{mountpoint:/var/lib/containers/storage/overlay/1bae1c6e8a21bdb06064171ddc9a2692867fc3b828f990884ebae73d15fc064a/merged major:0 minor:578 fsType:overlay blockSize:0} 
overlay_0-585:{mountpoint:/var/lib/containers/storage/overlay/c2fb86a70dc99975c2644b80a6401536e8bf527c0bbfd6a84a3c35916543d495/merged major:0 minor:585 fsType:overlay blockSize:0} overlay_0-587:{mountpoint:/var/lib/containers/storage/overlay/8b7eecab0a535401dcd96c256abb7b792c7cb376da38e663b84bd5705223b645/merged major:0 minor:587 fsType:overlay blockSize:0} overlay_0-589:{mountpoint:/var/lib/containers/storage/overlay/33f962df3805530997ba51098de5bc84da5c1cd132eb0689321b3437ab650c11/merged major:0 minor:589 fsType:overlay blockSize:0} overlay_0-591:{mountpoint:/var/lib/containers/storage/overlay/871e203e321e540d9d773e24c2073e24e18a7a6dfa2cec4fe4326be91bcf153f/merged major:0 minor:591 fsType:overlay blockSize:0} overlay_0-599:{mountpoint:/var/lib/containers/storage/overlay/ab4cd4c30819255c896da9cb7b6eeda085751bf038e361ec26e9437cb96da12d/merged major:0 minor:599 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/2ba0c2cb9c02a192cb30a7e41dc61695dfa4ef4406d228b4a33dfb67cef6a0d9/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-600:{mountpoint:/var/lib/containers/storage/overlay/212081992c09b783d896850e43847bc9bed7e28764cc8a4b77ca946556a2e2a0/merged major:0 minor:600 fsType:overlay blockSize:0} overlay_0-607:{mountpoint:/var/lib/containers/storage/overlay/701ec95047387c0a644db622070a7417760d5a609097040d4268d5fad560b4c4/merged major:0 minor:607 fsType:overlay blockSize:0} overlay_0-642:{mountpoint:/var/lib/containers/storage/overlay/ee92a14b77d3fd5e1ab02b11f3349fb249d5eb7a01e100385c3b57257733ea9a/merged major:0 minor:642 fsType:overlay blockSize:0} overlay_0-649:{mountpoint:/var/lib/containers/storage/overlay/f7d98e8b1c063479e9f3d979227180d5e92fbdee28352b7e180861a0042b0b7c/merged major:0 minor:649 fsType:overlay blockSize:0} overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/35c9e0de3a3ac2b8e3d9066694f64035d615a00657b378f89fd318155fdc415b/merged major:0 minor:65 fsType:overlay blockSize:0} 
overlay_0-665:{mountpoint:/var/lib/containers/storage/overlay/42420892e71b5234468d3f171fbe9ee3a760f23e774b26abfef40f73132c3daa/merged major:0 minor:665 fsType:overlay blockSize:0} overlay_0-667:{mountpoint:/var/lib/containers/storage/overlay/339de05156af2d38113fcb7e1a41cedbc7dc5a89d1316055ba5a4fa9363274d3/merged major:0 minor:667 fsType:overlay blockSize:0} overlay_0-671:{mountpoint:/var/lib/containers/storage/overlay/901f234212f0c60a186cb938e375e40e0e699c18fb826ffcdc4142ede2d10d16/merged major:0 minor:671 fsType:overlay blockSize:0} overlay_0-687:{mountpoint:/var/lib/containers/storage/overlay/3b37b7a866fbda668b447fa7a52344aafc0d9c63ce2a5a3168a5e9a122b3c519/merged major:0 minor:687 fsType:overlay blockSize:0} overlay_0-689:{mountpoint:/var/lib/containers/storage/overlay/d3782a9d201cf37b90e63d3d2dcb34b1606c820147d352ab8043e7538db4d91e/merged major:0 minor:689 fsType:overlay blockSize:0} overlay_0-691:{mountpoint:/var/lib/containers/storage/overlay/b230933e4a8a7757c337fc5f50bc2fe4f31cefaa3967bda7dbc42619923c6fbb/merged major:0 minor:691 fsType:overlay blockSize:0} overlay_0-692:{mountpoint:/var/lib/containers/storage/overlay/9842dd8bfbd0269094badc02223f917236e5bcc1c1afc6cfef029f74ca46316a/merged major:0 minor:692 fsType:overlay blockSize:0} overlay_0-694:{mountpoint:/var/lib/containers/storage/overlay/75cef4f105c63b17105e287edaffe4d5a2aa21f60837f4ea97b3e5e35d497322/merged major:0 minor:694 fsType:overlay blockSize:0} overlay_0-700:{mountpoint:/var/lib/containers/storage/overlay/759e41ef43835d0aeafb94d807bca2052b991699605fc89387ba5cb23688ae8a/merged major:0 minor:700 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/638afd5d2affb24b47bacc98cea5c20ca74668919598cadc566514444769bdb0/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-712:{mountpoint:/var/lib/containers/storage/overlay/dc285b46c90ae9dce0195e1928070b36de8c249c1bcc462b2d2b561475c6b29b/merged major:0 minor:712 fsType:overlay blockSize:0} 
overlay_0-714:{mountpoint:/var/lib/containers/storage/overlay/f9a2fec5df45165f84f33750522e603b291eeaf4bf11bb88673513d97d9550ad/merged major:0 minor:714 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/4d50a65482f65e2d4f83e2d6cbd3a2b795c53acb83ccf33a3fa84156b377d4d5/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-731:{mountpoint:/var/lib/containers/storage/overlay/b00646cc23d8d7d44c37793ba3e99e6d2253af5d23bb2fd6de194dc674973a24/merged major:0 minor:731 fsType:overlay blockSize:0} overlay_0-77:{mountpoint:/var/lib/containers/storage/overlay/8e24ce30a103b02e9b47649f3a5f254397c0a9d20fa1bf2cb0041704756b9117/merged major:0 minor:77 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/2212c30a96969fa1e46a4eb46e9adc594683a68150e7cba9b67fbf665ee42a9b/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-80:{mountpoint:/var/lib/containers/storage/overlay/635ed305b952d2952456a6c8e453220a1c9c67400413149578932c33af0c1171/merged major:0 minor:80 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/d5b45747c23d13199fbcd2493585a67f93f49ddb4d811058535028a1d13681b4/merged major:0 minor:81 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/479f30ee16080c17fdeb9a3334aac0aad0be2861088f582207c9a2881db5c326/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-85:{mountpoint:/var/lib/containers/storage/overlay/8cd7e2fc4243b2b3c4095030d51795eee6646111c728baffb78a3028898da602/merged major:0 minor:85 fsType:overlay blockSize:0} overlay_0-90:{mountpoint:/var/lib/containers/storage/overlay/ad6010984a51e47e089f5154245e36b05c8ea20c6f4e8cd5e5da4332e3501284/merged major:0 minor:90 fsType:overlay blockSize:0} overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/921be157f806463c58162d3b283d375ae32bd9157dba33fbf2ae2d9d2a1aa9b4/merged major:0 minor:92 fsType:overlay blockSize:0} 
overlay_0-93:{mountpoint:/var/lib/containers/storage/overlay/4ce68edc06510c173da7f0ba8d75202e39c586f4c15834e4f76527ca997d5d18/merged major:0 minor:93 fsType:overlay blockSize:0}] Mar 19 09:23:34.781681 master-0 kubenswrapper[13205]: I0319 09:23:34.780801 13205 manager.go:217] Machine: {Timestamp:2026-03-19 09:23:34.78002266 +0000 UTC m=+0.112329578 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:dab19efcf33543febdac139f3c303589 SystemUUID:dab19efc-f335-43fe-bdac-139f3c303589 BootID:870de220-908c-4452-8349-8f04a86857c3 Filesystems:[{Device:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~projected/kube-api-access-2hnvh DeviceMajor:0 DeviceMinor:102 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~projected/kube-api-access-djxfs DeviceMajor:0 DeviceMinor:229 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2451d7e3dd79303504d5964f5bc9fe498e3fce32e9bf236a0e1ab73d89c4fa39/userdata/shm DeviceMajor:0 DeviceMinor:556 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2c994266a2d02f0511c56e203e02c66ec993c8a4956cebe37152ed3179a4c4ff/userdata/shm DeviceMajor:0 DeviceMinor:558 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-463 DeviceMajor:0 DeviceMinor:463 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4e6b0fbcf10efb9d378e7013c9ee95c7eea5f13187283f4e3dcc1192d68f1166/userdata/shm DeviceMajor:0 DeviceMinor:552 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:overlay_0-570 DeviceMajor:0 DeviceMinor:570 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9afd6ea2c1d8f05e8e4fc03f47178ac0a2f4931512d72e1dd34b6edbe52cf174/userdata/shm DeviceMajor:0 DeviceMinor:109 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/82537ef54e6382f32d5fe80aff3875a880fc715c44848fde8e9d22a20125f223/userdata/shm DeviceMajor:0 DeviceMinor:499 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c09069a2109d448b73c63e5e3d2a41051b8198531ad6e6a692843369313b17a8/userdata/shm DeviceMajor:0 DeviceMinor:727 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~projected/kube-api-access-2vcf6 DeviceMajor:0 DeviceMinor:226 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~projected/kube-api-access-vl7t5 DeviceMajor:0 DeviceMinor:230 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0865b3dd8e414cf36fa73f2f26a1125029b6401943086385aaef6e6adbd387e7/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:405 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-528 DeviceMajor:0 DeviceMinor:528 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8bbb7eb717a10731a76fbab7e75a4760990dac18f169f5c55d4ff290082a576b/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/10c609bb-136a-4ce2-b9e2-0a03e1a37a62/volumes/kubernetes.io~projected/kube-api-access-tpgbq DeviceMajor:0 DeviceMinor:297 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-420 DeviceMajor:0 DeviceMinor:420 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:437 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:409 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:508 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-408 DeviceMajor:0 DeviceMinor:408 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-689 DeviceMajor:0 DeviceMinor:689 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d146d705aef07c62cae684b04e48b6f5db2109ae5a19f2d427637f8db5f61221/userdata/shm DeviceMajor:0 DeviceMinor:421 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~projected/kube-api-access-w5f5s DeviceMajor:0 DeviceMinor:240 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~projected/kube-api-access-h5hk6 DeviceMajor:0 DeviceMinor:245 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-665 DeviceMajor:0 DeviceMinor:665 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-712 DeviceMajor:0 DeviceMinor:712 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f585ebb1-6210-463b-af85-fb29e1e7dfa5/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:348 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-600 DeviceMajor:0 DeviceMinor:600 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-714 DeviceMajor:0 DeviceMinor:714 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7b29cb7b-26d2-4fab-9e03-2d7fdf937592/volumes/kubernetes.io~projected/kube-api-access-8hw6b DeviceMajor:0 DeviceMinor:236 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-469 DeviceMajor:0 DeviceMinor:469 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-418 DeviceMajor:0 DeviceMinor:418 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54/userdata/shm DeviceMajor:0 DeviceMinor:685 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e3a470e3bacc4ee90522d655c1cb49f2266b41a208ae2967afd423c830e462e3/userdata/shm DeviceMajor:0 DeviceMinor:255 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-649 DeviceMajor:0 DeviceMinor:649 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c67ba3f4e9bb95eef468edeb24c18cd6982feefa1823f748db64378aa999c140/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:406 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~projected/kube-api-access-9blbc DeviceMajor:0 DeviceMinor:227 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~projected/kube-api-access-6v88k DeviceMajor:0 DeviceMinor:232 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2702ffe2a7096514b0cf147d61b08f45ac487590697d47b826f39e03c4994a7d/userdata/shm DeviceMajor:0 DeviceMinor:366 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-389 DeviceMajor:0 DeviceMinor:389 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-164 DeviceMajor:0 
DeviceMinor:164 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/8e073eb4-67f2-4de7-8848-50da73079dbc/volumes/kubernetes.io~projected/kube-api-access-9plst DeviceMajor:0 DeviceMinor:246 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/963f71e764d046880085fa5f09ddf4d6f88636354e79d8ab2e64d52ec74b74ae/userdata/shm DeviceMajor:0 DeviceMinor:257 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:414 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-585 DeviceMajor:0 DeviceMinor:585 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/67d66357-fcee-4e70-b563-5895b978ab55/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:680 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a68ad4116cab88705ddf2fb479c6fa07f6cc567a78a2d33208b00017ebb5225f/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:446 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/4abcf2ea-50f5-4d62-8a23-583438e5b451/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c8885dfa43b9e4c2a58db6e5ff12c1dfdfe9193837daeb55173993661ea9f46a/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-574 DeviceMajor:0 DeviceMinor:574 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef86c160aaf4ed6a2febd660641341c71096c2c568217ab433cd656af3876942/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1529b41a10d1658d384e0b7a36c11f0035fc8f768b5a9de54629908bbe77762e/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-334 DeviceMajor:0 DeviceMinor:334 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-449 DeviceMajor:0 DeviceMinor:449 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7/userdata/shm DeviceMajor:0 DeviceMinor:56 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 
DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-451 DeviceMajor:0 DeviceMinor:451 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2/userdata/shm DeviceMajor:0 DeviceMinor:73 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:542 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/kube-api-access-6bdnt DeviceMajor:0 DeviceMinor:244 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-391 DeviceMajor:0 DeviceMinor:391 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-444 DeviceMajor:0 DeviceMinor:444 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9e6502d00c4d560279a6b84e0eac2864639061d852a900f00e6d52ff81453134/userdata/shm DeviceMajor:0 DeviceMinor:559 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-323 DeviceMajor:0 DeviceMinor:323 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 
DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~projected/kube-api-access-xvd6f DeviceMajor:0 DeviceMinor:234 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c83b15dba69144ea146a0efbf84c34a34a7bbb646a98c775e5d8f6252c9784a/userdata/shm DeviceMajor:0 DeviceMinor:459 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c604e07b23c824fe44edd155fd3bcc4d87de07b9af516a6fc04d64e9a7ef4a11/userdata/shm DeviceMajor:0 DeviceMinor:554 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-568 DeviceMajor:0 DeviceMinor:568 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-77 DeviceMajor:0 DeviceMinor:77 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/68534953ce4f9d64b4ca25577e4617ff34537dd8175ec1c79125e169063bd6f3/userdata/shm DeviceMajor:0 DeviceMinor:547 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-369 DeviceMajor:0 DeviceMinor:369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:506 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e/userdata/shm DeviceMajor:0 DeviceMinor:669 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda/userdata/shm DeviceMajor:0 DeviceMinor:683 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-502 DeviceMajor:0 DeviceMinor:502 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:544 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-565 DeviceMajor:0 DeviceMinor:565 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-85 DeviceMajor:0 DeviceMinor:85 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-399 DeviceMajor:0 DeviceMinor:399 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f585ebb1-6210-463b-af85-fb29e1e7dfa5/volumes/kubernetes.io~projected/kube-api-access-5g4rw DeviceMajor:0 DeviceMinor:372 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-416 DeviceMajor:0 DeviceMinor:416 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3d5b3f08e9980af7a4eb46a62a5af4211db365f64342fe1705f26fa41b7b1331/userdata/shm DeviceMajor:0 DeviceMinor:551 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-587 DeviceMajor:0 DeviceMinor:587 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/67d66357-fcee-4e70-b563-5895b978ab55/volumes/kubernetes.io~projected/kube-api-access-sclqq 
DeviceMajor:0 DeviceMinor:682 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-53 DeviceMajor:0 DeviceMinor:53 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-107 DeviceMajor:0 DeviceMinor:107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:143 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:223 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/43cb2a3b-40e2-45ee-894a-6c833ee17efd/volumes/kubernetes.io~projected/kube-api-access-vf6dq DeviceMajor:0 DeviceMinor:235 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:543 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~projected/kube-api-access-jtw68 DeviceMajor:0 DeviceMinor:125 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-465 DeviceMajor:0 DeviceMinor:465 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d6cda39585354e47346ec04d7e9023161d8c669dfe02492069483d076fdb9801/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-700 DeviceMajor:0 DeviceMinor:700 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-692 DeviceMajor:0 DeviceMinor:692 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-503 DeviceMajor:0 DeviceMinor:503 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8c8ee765-76b8-4cde-8acb-6e5edd1b8149/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:545 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-589 DeviceMajor:0 DeviceMinor:589 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-642 DeviceMajor:0 DeviceMinor:642 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-671 DeviceMajor:0 DeviceMinor:671 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/944648f39111fd7c1a6ed081666cf0303ca2a6eb595623e82619c7478d3372ab/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:505 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/979d4d12-a560-4309-a1d3-cbebe853e8ea/volumes/kubernetes.io~projected/kube-api-access-rxjqg DeviceMajor:0 DeviceMinor:117 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volumes/kubernetes.io~projected/kube-api-access-lvnb9 DeviceMajor:0 DeviceMinor:127 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:225 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~projected/kube-api-access-jqwbw DeviceMajor:0 DeviceMinor:228 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/f0c75102-6790-4ed3-84da-61c3611186f8/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:242 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/58a6496fefda9dc10f2cd3d711f675ec3a41cf0c8719af9244e86cc4f0694683/userdata/shm DeviceMajor:0 DeviceMinor:247 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/bf5dde46-8a95-46a6-bee5-20d3a58f33ee/volumes/kubernetes.io~projected/kube-api-access-6hxq7 DeviceMajor:0 DeviceMinor:313 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/741c9d25-7634-41c0-bfe4-b7a15de4b341/volumes/kubernetes.io~projected/kube-api-access-4w7jx DeviceMajor:0 DeviceMinor:322 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:536 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/202f921c55b75970ad21b44a2165cf9cf2366346959189388624b5cff168cafb/userdata/shm DeviceMajor:0 DeviceMinor:548 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-90 DeviceMajor:0 DeviceMinor:90 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-336 DeviceMajor:0 DeviceMinor:336 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-519 DeviceMajor:0 DeviceMinor:519 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1d5e311c-1c6a-4d5d-8c2b-493025593934/volumes/kubernetes.io~projected/kube-api-access-49fpz DeviceMajor:0 DeviceMinor:681 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1c57ea8e09d1325e300d649233ce1b17315dc34efabdbd2fd35d3a0b5c00a757/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-398 DeviceMajor:0 DeviceMinor:398 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0ee32bb670dc76513805d1b62d5fffdb198c07008f45dbefa73a8b74cfb40229/userdata/shm DeviceMajor:0 DeviceMinor:249 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-572 DeviceMajor:0 DeviceMinor:572 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-339 DeviceMajor:0 DeviceMinor:339 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-320 DeviceMajor:0 DeviceMinor:320 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3b333a1e-2a7f-423a-8b40-99f30c89f740/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2e855eb091c3c19e500b45068cb0c0c879f09feb5f816f86f9d1253a9f1c5dcd/userdata/shm DeviceMajor:0 DeviceMinor:373 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/db26592d73a7736c3517fdc2847e1043ee77738e2bbfdfcabaf0fa7701a43b04/userdata/shm DeviceMajor:0 DeviceMinor:379 Capacity:67108864 
Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/16d2930b-486b-492d-983e-c6702d8f53a7/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:491 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-113 DeviceMajor:0 DeviceMinor:113 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~projected/kube-api-access-sfq74 DeviceMajor:0 DeviceMinor:138 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/9076d131-644a-4332-8a70-34f6b0f71575/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:541 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-464 DeviceMajor:0 DeviceMinor:464 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-422 DeviceMajor:0 DeviceMinor:422 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-357 DeviceMajor:0 DeviceMinor:357 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-383 DeviceMajor:0 DeviceMinor:383 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-332 DeviceMajor:0 DeviceMinor:332 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-667 DeviceMajor:0 DeviceMinor:667 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/76a2be65b345aaa03d42847ddf4106be40d256a72f66630810b64aeb72f9c081/userdata/shm DeviceMajor:0 DeviceMinor:52 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~projected/kube-api-access-clpb5 DeviceMajor:0 DeviceMinor:123 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:217 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/64f60856-22dd-4560-acff-c620e17844a1/volumes/kubernetes.io~projected/kube-api-access-cf5jl DeviceMajor:0 DeviceMinor:407 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f1b224e2c3a94acf439ad26c1933da88f3f4e0a2666ac3998bdb5a26f2159e15/userdata/shm DeviceMajor:0 DeviceMinor:460 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:104 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8527f5cd-2992-44be-90b8-e9086cedf46e/volumes/kubernetes.io~projected/kube-api-access-qp9jf DeviceMajor:0 DeviceMinor:233 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/50cc5384c1b8bb903ef5215671baf6cb4c6d2ce7a00d389208992a278c3b103c/userdata/shm DeviceMajor:0 DeviceMinor:447 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-115 DeviceMajor:0 DeviceMinor:115 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:243 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-275 DeviceMajor:0 DeviceMinor:275 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~projected/kube-api-access-kmpcn DeviceMajor:0 DeviceMinor:458 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-351 DeviceMajor:0 DeviceMinor:351 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-431 DeviceMajor:0 DeviceMinor:431 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ecba3762dd2d103496ed9fed52be51c550935d62b9dab4b76da7f92f8e0395b8/userdata/shm DeviceMajor:0 DeviceMinor:318 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-342 DeviceMajor:0 DeviceMinor:342 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dc65ec1f-b8fb-40d6-ac39-46b255a33221/volumes/kubernetes.io~projected/kube-api-access-ww85l DeviceMajor:0 DeviceMinor:365 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-381 DeviceMajor:0 DeviceMinor:381 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1d5e311c-1c6a-4d5d-8c2b-493025593934/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:675 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:/var/lib/kubelet/pods/c247d991-809e-46b6-9617-9b05007b7560/volumes/kubernetes.io~projected/kube-api-access-v4hqj DeviceMajor:0 DeviceMinor:216 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-599 DeviceMajor:0 DeviceMinor:599 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-467 DeviceMajor:0 DeviceMinor:467 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3e14d5393be022eff24be7e8d5e671dc610671f728796a2cb5a2309e1895b5f0/userdata/shm DeviceMajor:0 DeviceMinor:476 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-691 DeviceMajor:0 DeviceMinor:691 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2/volumes/kubernetes.io~projected/kube-api-access-jw2x6 DeviceMajor:0 DeviceMinor:726 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/e7fae040-28fa-4d97-8482-fd0dd12cc921/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1885558dee49f6f6ad4666eff4afb57c213620724cc5285f30bbd5409ae9582e/userdata/shm DeviceMajor:0 DeviceMinor:253 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-271 DeviceMajor:0 DeviceMinor:271 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~projected/kube-api-access-g7ppn DeviceMajor:0 DeviceMinor:415 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-473 DeviceMajor:0 DeviceMinor:473 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes/kubernetes.io~projected/kube-api-access-tgtgw DeviceMajor:0 DeviceMinor:238 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/259794ab-d027-497a-b08e-5a6d79057668/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:482 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/13072c08-c77c-4170-9ebe-98d63968747b/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:546 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/ff98fb1e-7a1f-4657-b085-743d6f2d28e2/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:627 Capacity:200003584 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-731 DeviceMajor:0 DeviceMinor:731 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-337 DeviceMajor:0 DeviceMinor:337 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-375 DeviceMajor:0 DeviceMinor:375 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2667f8abc3377f4e949ca5efee8caf3d44c08b3911b024266dc76fb9003cb2e0/userdata/shm DeviceMajor:0 DeviceMinor:442 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-471 DeviceMajor:0 DeviceMinor:471 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5001c0304645acfa799077998786ddfe7d90e702ba8e83ddc5ed0850af9bd30d/userdata/shm DeviceMajor:0 DeviceMinor:557 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} 
{Device:overlay_0-576 DeviceMajor:0 DeviceMinor:576 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:231 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/03d12dab-1215-4c1f-a9f5-27ea7174d308/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:507 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-578 DeviceMajor:0 DeviceMinor:578 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/44d9515c76d2b5369510d3737c2fe1814c5099a9199ebffb839eb4e657e0735e/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/41659a48-5eea-41cd-8b2a-b683dc15cc11/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-607 DeviceMajor:0 DeviceMinor:607 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/813a77628cdb690ef9ed760c21cb05d1f17fab6329f59eb55493fe5e4d55f0d3/userdata/shm DeviceMajor:0 DeviceMinor:75 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-288 DeviceMajor:0 DeviceMinor:288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-514 DeviceMajor:0 DeviceMinor:514 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4d2c5580-36f6-4107-af53-cfbd15080b30/volumes/kubernetes.io~projected/kube-api-access-x6j2m DeviceMajor:0 DeviceMinor:54 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-93 DeviceMajor:0 
DeviceMinor:93 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3b50118d-f7c2-4bff-aca0-5c6623819baf/volumes/kubernetes.io~projected/kube-api-access-6rqsq DeviceMajor:0 DeviceMinor:237 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/56e11aac-d199-404a-a0e2-82c28926746d/volumes/kubernetes.io~projected/kube-api-access-pg4cn DeviceMajor:0 DeviceMinor:345 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5/userdata/shm DeviceMajor:0 DeviceMinor:644 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f416a5a9dd7c863825858501cc1e0dcef058160b507e9a5e5d82fab9e9dd0c1/userdata/shm DeviceMajor:0 DeviceMinor:105 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d9eb3750-cb7b-4d3c-88bc-d1b68a370872/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/9a6c1523-e77c-4aac-814c-05d41215c42f/volumes/kubernetes.io~projected/kube-api-access-m6tp5 DeviceMajor:0 DeviceMinor:239 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/a3dddb56-d180-4b8a-85bd-77c3888d8f71/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:377 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/58ea8fcc-29b2-48ef-8629-2ba217c9d70c/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:139 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/310348963a49f41d34871a4c0d732a2191aaea2d2db0ebbe19d1390098835ced/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-694 DeviceMajor:0 DeviceMinor:694 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-687 DeviceMajor:0 DeviceMinor:687 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-80 DeviceMajor:0 DeviceMinor:80 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3dddb56-d180-4b8a-85bd-77c3888d8f71/volumes/kubernetes.io~projected/kube-api-access-nxbdq DeviceMajor:0 DeviceMinor:378 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/3374940a-612d-4335-8236-3ffe8d6e73a5/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:453 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-478 DeviceMajor:0 DeviceMinor:478 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0bce9154-cd31-4c4a-9d86-2903d5b1adad/volumes/kubernetes.io~projected/kube-api-access-4kr8w DeviceMajor:0 DeviceMinor:55 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/4f65184f-8fc2-4656-8776-a3b962aa1f5d/volumes/kubernetes.io~projected/kube-api-access-j65pb DeviceMajor:0 
DeviceMinor:241 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/5a51c701-7f2a-4332-a301-746e8a0eb475/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:413 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/c5966fa8-b9f0-42ee-a75b-20014782366d/volumes/kubernetes.io~projected/kube-api-access-v6sr8 DeviceMajor:0 DeviceMinor:87 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/915745b7520cddc649a8755f21f9aee4e835c7048a1ccbf6eea3461a98982a5e/userdata/shm DeviceMajor:0 DeviceMinor:91 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d664acc4-ec4f-4078-ae93-404a14ea18fc/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/kubelet/pods/a1098584-43b9-4f2c-83d2-22d95fb7b0c3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9034c9252b2dbe49fa20bf241af605c2b9efd4ec2d903f7338b331b9a335a60/userdata/shm DeviceMajor:0 DeviceMinor:550 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-591 DeviceMajor:0 DeviceMinor:591 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/157e3524-eb27-41ca-b49d-2697ee1245ca/volumes/kubernetes.io~projected/kube-api-access-qhzsr DeviceMajor:0 DeviceMinor:103 Capacity:32475516928 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/72c2a8691354c4c557c4f66cfa7db93075f810bfaffc8fba5e2d6aab857f58a8/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-118 DeviceMajor:0 DeviceMinor:118 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-330 DeviceMajor:0 DeviceMinor:330 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0865b3dd8e414cf MacAddress:0a:5d:87:16:b6:fa Speed:10000 Mtu:8900} {Name:0ee32bb670dc765 MacAddress:ae:dc:e5:43:41:14 Speed:10000 Mtu:8900} {Name:1885558dee49f6f MacAddress:3e:9b:ec:ce:7c:67 Speed:10000 Mtu:8900} {Name:1c57ea8e09d1325 MacAddress:ae:57:01:f8:2f:9d Speed:10000 Mtu:8900} {Name:202f921c55b7597 MacAddress:76:b8:15:c9:43:45 Speed:10000 Mtu:8900} {Name:2667f8abc3377f4 MacAddress:52:c8:4d:19:f3:37 Speed:10000 Mtu:8900} {Name:2702ffe2a709651 MacAddress:a6:d1:26:4a:94:a9 Speed:10000 Mtu:8900} {Name:2c994266a2d02f0 MacAddress:16:91:ad:03:8c:f6 Speed:10000 Mtu:8900} {Name:2e855eb091c3c19 MacAddress:2a:c5:86:18:ac:ce Speed:10000 Mtu:8900} {Name:310348963a49f41 MacAddress:a6:b9:02:2b:e2:b2 Speed:10000 Mtu:8900} {Name:3d5b3f08e9980af MacAddress:8e:67:aa:e4:0d:5e Speed:10000 Mtu:8900} {Name:3e14d5393be022e MacAddress:6e:9c:b8:6f:8e:56 Speed:10000 Mtu:8900} {Name:3eefb4e7e53cb0d MacAddress:4e:c5:5e:40:a3:e7 Speed:10000 Mtu:8900} {Name:4e6b0fbcf10efb9 MacAddress:46:fc:7f:39:e8:71 Speed:10000 Mtu:8900} {Name:5001c0304645acf MacAddress:d2:32:ab:42:f0:5e Speed:10000 Mtu:8900} {Name:50cc5384c1b8bb9 MacAddress:fa:f9:8e:d1:15:d9 Speed:10000 Mtu:8900} 
{Name:58a6496fefda9dc MacAddress:b6:48:43:cc:af:d7 Speed:10000 Mtu:8900} {Name:68534953ce4f9d6 MacAddress:c6:f7:73:b7:c3:b5 Speed:10000 Mtu:8900} {Name:6bf8f167b730f8b MacAddress:22:ae:fd:6a:f2:a2 Speed:10000 Mtu:8900} {Name:813a77628cdb690 MacAddress:e6:92:26:5f:ea:fb Speed:10000 Mtu:8900} {Name:82537ef54e6382f MacAddress:82:36:a0:05:83:05 Speed:10000 Mtu:8900} {Name:915745b7520cddc MacAddress:26:2b:2c:73:31:50 Speed:10000 Mtu:8900} {Name:944648f39111fd7 MacAddress:56:2c:ed:07:37:b9 Speed:10000 Mtu:8900} {Name:963f71e764d0468 MacAddress:7e:04:33:90:55:77 Speed:10000 Mtu:8900} {Name:9c83b15dba69144 MacAddress:96:61:a8:df:af:5b Speed:10000 Mtu:8900} {Name:9e6502d00c4d560 MacAddress:7e:4e:f7:6a:e8:0c Speed:10000 Mtu:8900} {Name:a68ad4116cab887 MacAddress:e2:a1:8f:7b:4f:cc Speed:10000 Mtu:8900} {Name:b34e9a2b3355632 MacAddress:e2:14:54:df:01:13 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:32:ab:c8:1a:45:70 Speed:0 Mtu:8900} {Name:c09069a2109d448 MacAddress:fe:db:b9:18:ab:35 Speed:10000 Mtu:8900} {Name:c604e07b23c824f MacAddress:06:3f:bb:0e:1d:a9 Speed:10000 Mtu:8900} {Name:c8885dfa43b9e4c MacAddress:9a:6d:d4:70:ac:82 Speed:10000 Mtu:8900} {Name:d146d705aef07c6 MacAddress:26:9c:a9:d8:01:99 Speed:10000 Mtu:8900} {Name:d9034c9252b2dbe MacAddress:ea:a7:32:32:33:ff Speed:10000 Mtu:8900} {Name:db26592d73a7736 MacAddress:86:cb:1f:ea:17:4d Speed:10000 Mtu:8900} {Name:e3a470e3bacc4ee MacAddress:d2:33:cb:b5:3c:c8 Speed:10000 Mtu:8900} {Name:ecba3762dd2d103 MacAddress:aa:f8:eb:8b:fc:9f Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:5a:31:1f Speed:-1 Mtu:9000} {Name:f1b224e2c3a94ac MacAddress:36:22:01:06:f5:cf Speed:10000 Mtu:8900} {Name:f2b06f4e36c6672 MacAddress:1a:09:c3:7a:d7:72 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:3a:39:e3:03:14:35 Speed:0 Mtu:1500}] Topology:[{Id:0 
Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] 
SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 09:23:34.782216 master-0 kubenswrapper[13205]: I0319 09:23:34.781622 13205 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 19 09:23:34.782216 master-0 kubenswrapper[13205]: I0319 09:23:34.781813 13205 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 09:23:34.782216 master-0 kubenswrapper[13205]: I0319 09:23:34.782210 13205 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 09:23:34.782416 master-0 kubenswrapper[13205]: I0319 09:23:34.782367 13205 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 09:23:34.782598 master-0 kubenswrapper[13205]: I0319 09:23:34.782406 13205 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"
Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 09:23:34.782643 master-0 kubenswrapper[13205]: I0319 09:23:34.782616 13205 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 09:23:34.782643 master-0 kubenswrapper[13205]: I0319 09:23:34.782627 13205 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 09:23:34.782643 master-0 kubenswrapper[13205]: I0319 09:23:34.782635 13205 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:23:34.782716 master-0 kubenswrapper[13205]: I0319 09:23:34.782657 13205 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:23:34.782836 master-0 kubenswrapper[13205]: I0319 09:23:34.782811 13205 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:23:34.783195 master-0 kubenswrapper[13205]: I0319 09:23:34.783166 13205 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 09:23:34.783254 master-0 kubenswrapper[13205]: I0319 09:23:34.783240 13205 kubelet.go:418] "Attempting to sync node with API server" Mar 19 09:23:34.783296 master-0 kubenswrapper[13205]: I0319 09:23:34.783256 13205 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 09:23:34.783341 master-0 kubenswrapper[13205]: I0319 09:23:34.783307 13205 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 09:23:34.783341 master-0 kubenswrapper[13205]: I0319 09:23:34.783324 13205 kubelet.go:324] "Adding apiserver pod source" Mar 
19 09:23:34.785068 master-0 kubenswrapper[13205]: W0319 09:23:34.785004 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:23:34.785141 master-0 kubenswrapper[13205]: E0319 09:23:34.785071 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:23:34.789593 master-0 kubenswrapper[13205]: I0319 09:23:34.789550 13205 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 09:23:34.790905 master-0 kubenswrapper[13205]: I0319 09:23:34.790869 13205 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 19 09:23:34.791086 master-0 kubenswrapper[13205]: W0319 09:23:34.791027 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:23:34.791122 master-0 kubenswrapper[13205]: I0319 09:23:34.791100 13205 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 19 09:23:34.791122 master-0 kubenswrapper[13205]: E0319 09:23:34.791100 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:23:34.792262 master-0 kubenswrapper[13205]: I0319 09:23:34.792223 13205 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 09:23:34.792398 master-0 kubenswrapper[13205]: I0319 09:23:34.792357 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 09:23:34.792398 master-0 kubenswrapper[13205]: I0319 09:23:34.792387 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 09:23:34.792398 master-0 kubenswrapper[13205]: I0319 09:23:34.792396 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792406 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792414 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792424 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792432 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792439 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792450 13205 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/fc" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792458 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792469 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792483 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 09:23:34.792514 master-0 kubenswrapper[13205]: I0319 09:23:34.792517 13205 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 09:23:34.793043 master-0 kubenswrapper[13205]: I0319 09:23:34.792999 13205 server.go:1280] "Started kubelet" Mar 19 09:23:34.793195 master-0 kubenswrapper[13205]: I0319 09:23:34.793148 13205 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:23:34.793234 master-0 kubenswrapper[13205]: I0319 09:23:34.793153 13205 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 09:23:34.793715 master-0 kubenswrapper[13205]: I0319 09:23:34.793326 13205 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 09:23:34.795268 master-0 kubenswrapper[13205]: I0319 09:23:34.795193 13205 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 19 09:23:34.795401 master-0 kubenswrapper[13205]: E0319 09:23:34.793737 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e33c2c17616e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:23:34.79296791 +0000 UTC m=+0.125274798,LastTimestamp:2026-03-19 09:23:34.79296791 +0000 UTC m=+0.125274798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:23:34.796052 master-0 kubenswrapper[13205]: I0319 09:23:34.795969 13205 server.go:449] "Adding debug handlers to kubelet server" Mar 19 09:23:34.796256 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 19 09:23:34.797923 master-0 kubenswrapper[13205]: I0319 09:23:34.797884 13205 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 09:23:34.803189 master-0 kubenswrapper[13205]: I0319 09:23:34.803117 13205 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 09:23:34.803279 master-0 kubenswrapper[13205]: I0319 09:23:34.803198 13205 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 09:23:34.804246 master-0 kubenswrapper[13205]: I0319 09:23:34.804174 13205 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:08:48 +0000 UTC, rotation deadline is 2026-03-20 02:50:59.284674742 +0000 UTC Mar 19 09:23:34.804246 master-0 kubenswrapper[13205]: I0319 09:23:34.804232 13205 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h27m24.480445371s for next certificate rotation Mar 19 09:23:34.806767 master-0 kubenswrapper[13205]: I0319 09:23:34.806737 13205 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 09:23:34.806767 master-0 kubenswrapper[13205]: I0319 09:23:34.806759 13205 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 09:23:34.806877 
master-0 kubenswrapper[13205]: I0319 09:23:34.806851 13205 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 19 09:23:34.807099 master-0 kubenswrapper[13205]: E0319 09:23:34.807066 13205 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:23:34.812768 master-0 kubenswrapper[13205]: E0319 09:23:34.812649 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 19 09:23:34.813005 master-0 kubenswrapper[13205]: W0319 09:23:34.812914 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:23:34.813061 master-0 kubenswrapper[13205]: E0319 09:23:34.813022 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:23:34.814452 master-0 kubenswrapper[13205]: I0319 09:23:34.814407 13205 factory.go:153] Registering CRI-O factory Mar 19 09:23:34.814520 master-0 kubenswrapper[13205]: I0319 09:23:34.814484 13205 factory.go:221] Registration of the crio container factory successfully Mar 19 09:23:34.814520 master-0 kubenswrapper[13205]: I0319 09:23:34.814459 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b50118d-f7c2-4bff-aca0-5c6623819baf" 
volumeName="kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.814520 master-0 kubenswrapper[13205]: I0319 09:23:34.814518 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b29cb7b-26d2-4fab-9e03-2d7fdf937592" volumeName="kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b" seLinuxMountContext="" Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814552 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9076d131-644a-4332-8a70-34f6b0f71575" volumeName="kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert" seLinuxMountContext="" Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814564 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy" seLinuxMountContext="" Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814578 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9" seLinuxMountContext="" Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814590 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d5e311c-1c6a-4d5d-8c2b-493025593934" volumeName="kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca" seLinuxMountContext="" Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814602 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3b50118d-f7c2-4bff-aca0-5c6623819baf" volumeName="kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets" seLinuxMountContext="" Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814609 13205 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814616 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51b88818-5108-40db-90c8-4f2e7198959e" volumeName="kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access" seLinuxMountContext="" Mar 19 09:23:34.814630 master-0 kubenswrapper[13205]: I0319 09:23:34.814621 13205 factory.go:55] Registering systemd factory Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814631 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-encryption-config" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814667 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b333a1e-2a7f-423a-8b40-99f30c89f740" volumeName="kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814679 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="741c9d25-7634-41c0-bfe4-b7a15de4b341" volumeName="kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-catalog-content" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: 
I0319 09:23:34.814690 13205 factory.go:221] Registration of the systemd container factory successfully Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814695 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9076d131-644a-4332-8a70-34f6b0f71575" volumeName="kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814717 13205 factory.go:103] Registering Raw factory Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814742 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" volumeName="kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814766 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814778 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8beda3a0-a653-4810-b3f2-d25badb21ab1" volumeName="kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814792 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9076d131-644a-4332-8a70-34f6b0f71575" volumeName="kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814807 13205 manager.go:1196] Started 
watching for new ooms in manager Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814813 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814824 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2" volumeName="kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-catalog-content" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814839 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0bce9154-cd31-4c4a-9d86-2903d5b1adad" volumeName="kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814866 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" volumeName="kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814879 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="741c9d25-7634-41c0-bfe4-b7a15de4b341" volumeName="kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-utilities" seLinuxMountContext="" Mar 19 09:23:34.814875 master-0 kubenswrapper[13205]: I0319 09:23:34.814890 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8527f5cd-2992-44be-90b8-e9086cedf46e" volumeName="kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert" 
seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814901 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0c75102-6790-4ed3-84da-61c3611186f8" volumeName="kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814915 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf5dde46-8a95-46a6-bee5-20d3a58f33ee" volumeName="kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-catalog-content" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814925 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5966fa8-b9f0-42ee-a75b-20014782366d" volumeName="kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-catalog-content" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814935 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e" volumeName="kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814950 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814964 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c247d991-809e-46b6-9617-9b05007b7560" volumeName="kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca" 
seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814974 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" volumeName="kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.814989 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0bce9154-cd31-4c4a-9d86-2903d5b1adad" volumeName="kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815004 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d5e311c-1c6a-4d5d-8c2b-493025593934" volumeName="kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815017 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" volumeName="kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815027 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815042 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f585ebb1-6210-463b-af85-fb29e1e7dfa5" volumeName="kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-kube-api-access-5g4rw" 
seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815068 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" volumeName="kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815080 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815091 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" volumeName="kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815106 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a51c701-7f2a-4332-a301-746e8a0eb475" volumeName="kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-client" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815122 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="259794ab-d027-497a-b08e-5a6d79057668" volumeName="kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815140 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-etcd-client" 
seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815152 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c8ee765-76b8-4cde-8acb-6e5edd1b8149" volumeName="kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815166 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="67d66357-fcee-4e70-b563-5895b978ab55" volumeName="kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815189 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5966fa8-b9f0-42ee-a75b-20014782366d" volumeName="kubernetes.io/projected/c5966fa8-b9f0-42ee-a75b-20014782366d-kube-api-access-v6sr8" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815201 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0c75102-6790-4ed3-84da-61c3611186f8" volumeName="kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815231 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03d12dab-1215-4c1f-a9f5-27ea7174d308" volumeName="kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815243 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0bce9154-cd31-4c4a-9d86-2903d5b1adad" 
volumeName="kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815253 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43cb2a3b-40e2-45ee-894a-6c833ee17efd" volumeName="kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815266 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d2c5580-36f6-4107-af53-cfbd15080b30" volumeName="kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815278 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815291 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51b88818-5108-40db-90c8-4f2e7198959e" volumeName="kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815303 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815317 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3374940a-612d-4335-8236-3ffe8d6e73a5" 
volumeName="kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-ca-certs" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815341 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-etcd-serving-ca" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815358 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3dddb56-d180-4b8a-85bd-77c3888d8f71" volumeName="kubernetes.io/projected/a3dddb56-d180-4b8a-85bd-77c3888d8f71-kube-api-access-nxbdq" seLinuxMountContext="" Mar 19 09:23:34.815346 master-0 kubenswrapper[13205]: I0319 09:23:34.815374 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="157e3524-eb27-41ca-b49d-2697ee1245ca" volumeName="kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815398 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a51c701-7f2a-4332-a301-746e8a0eb475" volumeName="kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-policies" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815412 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a51c701-7f2a-4332-a301-746e8a0eb475" volumeName="kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815423 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5966fa8-b9f0-42ee-a75b-20014782366d" 
volumeName="kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-utilities" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815437 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a51c701-7f2a-4332-a301-746e8a0eb475" volumeName="kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815448 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c8ee765-76b8-4cde-8acb-6e5edd1b8149" volumeName="kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815458 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d664acc4-ec4f-4078-ae93-404a14ea18fc" volumeName="kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815472 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03d12dab-1215-4c1f-a9f5-27ea7174d308" volumeName="kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815483 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10c609bb-136a-4ce2-b9e2-0a03e1a37a62" volumeName="kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815496 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16d2930b-486b-492d-983e-c6702d8f53a7" 
volumeName="kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815506 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4abcf2ea-50f5-4d62-8a23-583438e5b451" volumeName="kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815516 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e" volumeName="kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815559 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8527f5cd-2992-44be-90b8-e9086cedf46e" volumeName="kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815592 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815606 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815620 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c247d991-809e-46b6-9617-9b05007b7560" 
volumeName="kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815630 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d5e311c-1c6a-4d5d-8c2b-493025593934" volumeName="kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815642 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815706 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f65184f-8fc2-4656-8776-a3b962aa1f5d" volumeName="kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815785 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b29cb7b-26d2-4fab-9e03-2d7fdf937592" volumeName="kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815804 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf5dde46-8a95-46a6-bee5-20d3a58f33ee" volumeName="kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-utilities" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815828 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" 
volumeName="kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815853 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3374940a-612d-4335-8236-3ffe8d6e73a5" volumeName="kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-kube-api-access-kmpcn" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815881 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3374940a-612d-4335-8236-3ffe8d6e73a5" volumeName="kubernetes.io/secret/3374940a-612d-4335-8236-3ffe8d6e73a5-catalogserver-certs" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815903 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d2c5580-36f6-4107-af53-cfbd15080b30" volumeName="kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815916 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/projected/64f60856-22dd-4560-acff-c620e17844a1-kube-api-access-cf5jl" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815931 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51b88818-5108-40db-90c8-4f2e7198959e" volumeName="kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815950 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" 
volumeName="kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-audit" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815964 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="67d66357-fcee-4e70-b563-5895b978ab55" volumeName="kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815984 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.815997 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9a6c1523-e77c-4aac-814c-05d41215c42f" volumeName="kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.816011 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d664acc4-ec4f-4078-ae93-404a14ea18fc" volumeName="kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.816035 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.816064 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="13072c08-c77c-4170-9ebe-98d63968747b" volumeName="kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.816081 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.816105 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8beda3a0-a653-4810-b3f2-d25badb21ab1" volumeName="kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs" seLinuxMountContext="" Mar 19 09:23:34.816171 master-0 kubenswrapper[13205]: I0319 09:23:34.816120 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9a6c1523-e77c-4aac-814c-05d41215c42f" volumeName="kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5" seLinuxMountContext="" Mar 19 09:23:34.817060 master-0 kubenswrapper[13205]: I0319 09:23:34.816597 13205 manager.go:319] Starting recovery of all containers Mar 19 09:23:34.817495 master-0 kubenswrapper[13205]: I0319 09:23:34.816140 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="41659a48-5eea-41cd-8b2a-b683dc15cc11" volumeName="kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides" seLinuxMountContext="" Mar 19 09:23:34.817590 master-0 kubenswrapper[13205]: I0319 09:23:34.817519 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9076d131-644a-4332-8a70-34f6b0f71575" volumeName="kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6" seLinuxMountContext="" Mar 19 09:23:34.817627 
master-0 kubenswrapper[13205]: I0319 09:23:34.817590 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c247d991-809e-46b6-9617-9b05007b7560" volumeName="kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj" seLinuxMountContext="" Mar 19 09:23:34.817627 master-0 kubenswrapper[13205]: I0319 09:23:34.817620 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" volumeName="kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config" seLinuxMountContext="" Mar 19 09:23:34.817683 master-0 kubenswrapper[13205]: I0319 09:23:34.817639 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a51c701-7f2a-4332-a301-746e8a0eb475" volumeName="kubernetes.io/projected/5a51c701-7f2a-4332-a301-746e8a0eb475-kube-api-access-g7ppn" seLinuxMountContext="" Mar 19 09:23:34.817683 master-0 kubenswrapper[13205]: I0319 09:23:34.817656 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-config" seLinuxMountContext="" Mar 19 09:23:34.817683 master-0 kubenswrapper[13205]: I0319 09:23:34.817670 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf5dde46-8a95-46a6-bee5-20d3a58f33ee" volumeName="kubernetes.io/projected/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-kube-api-access-6hxq7" seLinuxMountContext="" Mar 19 09:23:34.817759 master-0 kubenswrapper[13205]: I0319 09:23:34.817686 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e" volumeName="kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 
09:23:34.817759 master-0 kubenswrapper[13205]: I0319 09:23:34.817707 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 19 09:23:34.817759 master-0 kubenswrapper[13205]: I0319 09:23:34.817722 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2" volumeName="kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-utilities" seLinuxMountContext="" Mar 19 09:23:34.817759 master-0 kubenswrapper[13205]: I0319 09:23:34.817736 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw" seLinuxMountContext="" Mar 19 09:23:34.817759 master-0 kubenswrapper[13205]: I0319 09:23:34.817749 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="13072c08-c77c-4170-9ebe-98d63968747b" volumeName="kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5" seLinuxMountContext="" Mar 19 09:23:34.817883 master-0 kubenswrapper[13205]: I0319 09:23:34.817762 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16d2930b-486b-492d-983e-c6702d8f53a7" volumeName="kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6" seLinuxMountContext="" Mar 19 09:23:34.817883 master-0 kubenswrapper[13205]: I0319 09:23:34.817804 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a51c701-7f2a-4332-a301-746e8a0eb475" volumeName="kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-serving-ca" 
seLinuxMountContext="" Mar 19 09:23:34.817883 master-0 kubenswrapper[13205]: I0319 09:23:34.817827 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a51c701-7f2a-4332-a301-746e8a0eb475" volumeName="kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-encryption-config" seLinuxMountContext="" Mar 19 09:23:34.817883 master-0 kubenswrapper[13205]: I0319 09:23:34.817844 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config" seLinuxMountContext="" Mar 19 09:23:34.817978 master-0 kubenswrapper[13205]: I0319 09:23:34.817921 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dc65ec1f-b8fb-40d6-ac39-46b255a33221" volumeName="kubernetes.io/projected/dc65ec1f-b8fb-40d6-ac39-46b255a33221-kube-api-access-ww85l" seLinuxMountContext="" Mar 19 09:23:34.817978 master-0 kubenswrapper[13205]: I0319 09:23:34.817967 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f585ebb1-6210-463b-af85-fb29e1e7dfa5" volumeName="kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-ca-certs" seLinuxMountContext="" Mar 19 09:23:34.818028 master-0 kubenswrapper[13205]: I0319 09:23:34.817985 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03d12dab-1215-4c1f-a9f5-27ea7174d308" volumeName="kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca" seLinuxMountContext="" Mar 19 09:23:34.818028 master-0 kubenswrapper[13205]: I0319 09:23:34.818001 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43cb2a3b-40e2-45ee-894a-6c833ee17efd" volumeName="kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq" 
seLinuxMountContext="" Mar 19 09:23:34.818028 master-0 kubenswrapper[13205]: I0319 09:23:34.818019 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d2c5580-36f6-4107-af53-cfbd15080b30" volumeName="kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m" seLinuxMountContext="" Mar 19 09:23:34.818104 master-0 kubenswrapper[13205]: I0319 09:23:34.818036 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f65184f-8fc2-4656-8776-a3b962aa1f5d" volumeName="kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb" seLinuxMountContext="" Mar 19 09:23:34.818104 master-0 kubenswrapper[13205]: I0319 09:23:34.818052 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d664acc4-ec4f-4078-ae93-404a14ea18fc" volumeName="kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config" seLinuxMountContext="" Mar 19 09:23:34.818159 master-0 kubenswrapper[13205]: I0319 09:23:34.818112 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="157e3524-eb27-41ca-b49d-2697ee1245ca" volumeName="kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config" seLinuxMountContext="" Mar 19 09:23:34.818159 master-0 kubenswrapper[13205]: I0319 09:23:34.818131 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d5e311c-1c6a-4d5d-8c2b-493025593934" volumeName="kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz" seLinuxMountContext="" Mar 19 09:23:34.818159 master-0 kubenswrapper[13205]: I0319 09:23:34.818146 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3374940a-612d-4335-8236-3ffe8d6e73a5" 
volumeName="kubernetes.io/empty-dir/3374940a-612d-4335-8236-3ffe8d6e73a5-cache" seLinuxMountContext="" Mar 19 09:23:34.818235 master-0 kubenswrapper[13205]: I0319 09:23:34.818162 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4abcf2ea-50f5-4d62-8a23-583438e5b451" volumeName="kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh" seLinuxMountContext="" Mar 19 09:23:34.818235 master-0 kubenswrapper[13205]: I0319 09:23:34.818175 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-image-import-ca" seLinuxMountContext="" Mar 19 09:23:34.818235 master-0 kubenswrapper[13205]: I0319 09:23:34.818207 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e073eb4-67f2-4de7-8848-50da73079dbc" volumeName="kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst" seLinuxMountContext="" Mar 19 09:23:34.818235 master-0 kubenswrapper[13205]: I0319 09:23:34.818225 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c247d991-809e-46b6-9617-9b05007b7560" volumeName="kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token" seLinuxMountContext="" Mar 19 09:23:34.818344 master-0 kubenswrapper[13205]: I0319 09:23:34.818239 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" volumeName="kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config" seLinuxMountContext="" Mar 19 09:23:34.818344 master-0 kubenswrapper[13205]: I0319 09:23:34.818253 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b50118d-f7c2-4bff-aca0-5c6623819baf" 
volumeName="kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq" seLinuxMountContext="" Mar 19 09:23:34.818344 master-0 kubenswrapper[13205]: I0319 09:23:34.818266 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="67d66357-fcee-4e70-b563-5895b978ab55" volumeName="kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config" seLinuxMountContext="" Mar 19 09:23:34.818344 master-0 kubenswrapper[13205]: I0319 09:23:34.818277 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="67d66357-fcee-4e70-b563-5895b978ab55" volumeName="kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.818344 master-0 kubenswrapper[13205]: I0319 09:23:34.818293 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="259794ab-d027-497a-b08e-5a6d79057668" volumeName="kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k" seLinuxMountContext="" Mar 19 09:23:34.818344 master-0 kubenswrapper[13205]: I0319 09:23:34.818323 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0c75102-6790-4ed3-84da-61c3611186f8" volumeName="kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access" seLinuxMountContext="" Mar 19 09:23:34.818344 master-0 kubenswrapper[13205]: I0319 09:23:34.818336 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" volumeName="kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818354 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b333a1e-2a7f-423a-8b40-99f30c89f740" 
volumeName="kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818372 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" volumeName="kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818386 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818399 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="741c9d25-7634-41c0-bfe4-b7a15de4b341" volumeName="kubernetes.io/projected/741c9d25-7634-41c0-bfe4-b7a15de4b341-kube-api-access-4w7jx" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818411 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64f60856-22dd-4560-acff-c620e17844a1" volumeName="kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818425 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818441 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3dddb56-d180-4b8a-85bd-77c3888d8f71" 
volumeName="kubernetes.io/configmap/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-cabundle" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818454 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3dddb56-d180-4b8a-85bd-77c3888d8f71" volumeName="kubernetes.io/secret/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-key" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818471 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" volumeName="kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818487 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b333a1e-2a7f-423a-8b40-99f30c89f740" volumeName="kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818499 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43cb2a3b-40e2-45ee-894a-6c833ee17efd" volumeName="kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert" seLinuxMountContext="" Mar 19 09:23:34.818515 master-0 kubenswrapper[13205]: I0319 09:23:34.818514 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56e11aac-d199-404a-a0e2-82c28926746d" volumeName="kubernetes.io/projected/56e11aac-d199-404a-a0e2-82c28926746d-kube-api-access-pg4cn" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818550 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2" 
volumeName="kubernetes.io/projected/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-kube-api-access-jw2x6" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818566 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f585ebb1-6210-463b-af85-fb29e1e7dfa5" volumeName="kubernetes.io/empty-dir/f585ebb1-6210-463b-af85-fb29e1e7dfa5-cache" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818581 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7fae040-28fa-4d97-8482-fd0dd12cc921" volumeName="kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818596 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03d12dab-1215-4c1f-a9f5-27ea7174d308" volumeName="kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818612 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="157e3524-eb27-41ca-b49d-2697ee1245ca" volumeName="kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818627 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d5e311c-1c6a-4d5d-8c2b-493025593934" volumeName="kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818644 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8527f5cd-2992-44be-90b8-e9086cedf46e" volumeName="kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818659 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c8ee765-76b8-4cde-8acb-6e5edd1b8149" volumeName="kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818675 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="979d4d12-a560-4309-a1d3-cbebe853e8ea" volumeName="kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818689 13205 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" volumeName="kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config" seLinuxMountContext="" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818702 13205 reconstruct.go:97] "Volume reconstruction finished" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: I0319 09:23:34.818711 13205 reconciler.go:26] "Reconciler: start to sync state" Mar 19 09:23:34.818827 master-0 kubenswrapper[13205]: E0319 09:23:34.818747 13205 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 19 09:23:34.845823 master-0 kubenswrapper[13205]: I0319 09:23:34.845742 13205 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 09:23:34.847775 master-0 kubenswrapper[13205]: I0319 09:23:34.847738 13205 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 19 09:23:34.847837 master-0 kubenswrapper[13205]: I0319 09:23:34.847809 13205 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 09:23:34.847874 master-0 kubenswrapper[13205]: I0319 09:23:34.847844 13205 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 09:23:34.847942 master-0 kubenswrapper[13205]: E0319 09:23:34.847922 13205 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 09:23:34.852610 master-0 kubenswrapper[13205]: W0319 09:23:34.851664 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:23:34.852610 master-0 kubenswrapper[13205]: E0319 09:23:34.851802 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:23:34.862304 master-0 kubenswrapper[13205]: I0319 09:23:34.862214 13205 generic.go:334] "Generic (PLEG): container finished" podID="c5966fa8-b9f0-42ee-a75b-20014782366d" containerID="eb4cfd7b6c5813e770652456c0da282ad0cb1d1a21af6516f6555be2f1978f99" exitCode=0 Mar 19 09:23:34.862304 master-0 kubenswrapper[13205]: I0319 09:23:34.862270 13205 generic.go:334] "Generic (PLEG): container finished" podID="c5966fa8-b9f0-42ee-a75b-20014782366d" containerID="1d94d4e69569ac4f86a917501b5ce54c3042abc1e756a92eeb7e23135f068b96" exitCode=0 Mar 19 09:23:34.867653 master-0 kubenswrapper[13205]: I0319 09:23:34.867600 13205 generic.go:334] "Generic (PLEG): container finished" 
podID="3b50118d-f7c2-4bff-aca0-5c6623819baf" containerID="4be81bb4984289b1445cac9a7d29a8575166e1227c0a164f03ad826b2adf5846" exitCode=0 Mar 19 09:23:34.867653 master-0 kubenswrapper[13205]: I0319 09:23:34.867639 13205 generic.go:334] "Generic (PLEG): container finished" podID="3b50118d-f7c2-4bff-aca0-5c6623819baf" containerID="a44bc43b2d58d1a0d645e857d97d66ce4eb842ccd368241fdd8860524859bfed" exitCode=0 Mar 19 09:23:34.867653 master-0 kubenswrapper[13205]: I0319 09:23:34.867647 13205 generic.go:334] "Generic (PLEG): container finished" podID="3b50118d-f7c2-4bff-aca0-5c6623819baf" containerID="eda46613435f0ad25039ff0c6a8755c37babfb4638110ab33aa3ce1f440dd317" exitCode=0 Mar 19 09:23:34.883914 master-0 kubenswrapper[13205]: I0319 09:23:34.883841 13205 generic.go:334] "Generic (PLEG): container finished" podID="5a51c701-7f2a-4332-a301-746e8a0eb475" containerID="f8a8d2ffc695381746f012239f51a0188a35a78cf69857ab2089866fefe4ec7f" exitCode=0 Mar 19 09:23:34.889291 master-0 kubenswrapper[13205]: E0319 09:23:34.889253 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:23:34.890184 master-0 kubenswrapper[13205]: I0319 09:23:34.890148 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-v9898_8527f5cd-2992-44be-90b8-e9086cedf46e/openshift-controller-manager-operator/0.log" Mar 19 09:23:34.890231 master-0 kubenswrapper[13205]: I0319 09:23:34.890197 13205 generic.go:334] "Generic (PLEG): container finished" podID="8527f5cd-2992-44be-90b8-e9086cedf46e" 
containerID="c9da4601818f501772c5c387239e3219ab4432a2bb45b7271b716c82c40ddaf7" exitCode=1 Mar 19 09:23:34.899478 master-0 kubenswrapper[13205]: I0319 09:23:34.899417 13205 generic.go:334] "Generic (PLEG): container finished" podID="64f60856-22dd-4560-acff-c620e17844a1" containerID="a09f4dbb43b5d238fce47264297abcc4f7a5bcbcd572aa1022072a1fc9dfe9a1" exitCode=0 Mar 19 09:23:34.901885 master-0 kubenswrapper[13205]: I0319 09:23:34.901834 13205 generic.go:334] "Generic (PLEG): container finished" podID="ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2" containerID="515536d3327c4350c49115e45e5daea011f0dd92405ace02a20c785016fd27e4" exitCode=0 Mar 19 09:23:34.901885 master-0 kubenswrapper[13205]: I0319 09:23:34.901880 13205 generic.go:334] "Generic (PLEG): container finished" podID="ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2" containerID="6a4a22a5cf9a6cf3b1dd393632748fd3e2677a48e7f3293f0563dbc6ae33d7aa" exitCode=0 Mar 19 09:23:34.903710 master-0 kubenswrapper[13205]: I0319 09:23:34.903660 13205 generic.go:334] "Generic (PLEG): container finished" podID="ac3507630eeeca1ec26dca5ed036e3bb" containerID="fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529" exitCode=0 Mar 19 09:23:34.907836 master-0 kubenswrapper[13205]: E0319 09:23:34.907801 13205 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:23:34.909187 master-0 kubenswrapper[13205]: I0319 09:23:34.909140 13205 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="6b51526a63cb4fc4843a03fc75fd50c63454c0795793d3149e658718010b95b1" exitCode=0 Mar 19 09:23:34.915661 master-0 kubenswrapper[13205]: I0319 09:23:34.915592 13205 generic.go:334] "Generic (PLEG): container finished" podID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerID="20ad0dc2c8fe0c77234c92295139868f3667eee62c0b6d6d6951ddd42c52079f" exitCode=0 Mar 19 09:23:34.915661 master-0 kubenswrapper[13205]: I0319 09:23:34.915630 13205 generic.go:334] "Generic 
(PLEG): container finished" podID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerID="bf149ff2c777ec19da6a404f555dbbecaec9d99f5badeb4692ea25e2aab65ea8" exitCode=0 Mar 19 09:23:34.917648 master-0 kubenswrapper[13205]: I0319 09:23:34.917607 13205 generic.go:334] "Generic (PLEG): container finished" podID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerID="63e480bd33c67f5ddbdb4cc89c4a2b081a014d57d3304086d0ed39b6f8f0a797" exitCode=0 Mar 19 09:23:34.921356 master-0 kubenswrapper[13205]: I0319 09:23:34.921289 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/0.log" Mar 19 09:23:34.921356 master-0 kubenswrapper[13205]: I0319 09:23:34.921334 13205 generic.go:334] "Generic (PLEG): container finished" podID="157e3524-eb27-41ca-b49d-2697ee1245ca" containerID="2d3477c3a9725b873c8e5413ca72191db0e07b17ecaa8a6d3f792473fd194137" exitCode=1 Mar 19 09:23:34.922971 master-0 kubenswrapper[13205]: I0319 09:23:34.922939 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_434aabfa-50db-407e-92d3-a034696613e3/installer/0.log" Mar 19 09:23:34.923054 master-0 kubenswrapper[13205]: I0319 09:23:34.923026 13205 generic.go:334] "Generic (PLEG): container finished" podID="434aabfa-50db-407e-92d3-a034696613e3" containerID="5a3b40e5aadf949e686ac4f447f2417ca9edf3ac74f9cc8e180b0ad3fbdc1cbc" exitCode=1 Mar 19 09:23:34.927786 master-0 kubenswrapper[13205]: I0319 09:23:34.927729 13205 generic.go:334] "Generic (PLEG): container finished" podID="3b333a1e-2a7f-423a-8b40-99f30c89f740" containerID="e7857b0cae9f1e592c846367f20964b7bdba92f2c028bce9260e23037d2618d9" exitCode=0 Mar 19 09:23:34.931766 master-0 kubenswrapper[13205]: I0319 09:23:34.931464 13205 generic.go:334] "Generic (PLEG): container finished" podID="d9eb3750-cb7b-4d3c-88bc-d1b68a370872" containerID="0ce27311ef590bbffcd62b67c2b6ee4f6f31b7ee4bc36c74deac775d99e52498" exitCode=0 Mar 19 
09:23:34.933657 master-0 kubenswrapper[13205]: I0319 09:23:34.933609 13205 generic.go:334] "Generic (PLEG): container finished" podID="e7fae040-28fa-4d97-8482-fd0dd12cc921" containerID="7899eaeea83e799e75607f310011944713a832305f4796c7131bde2f6c40224c" exitCode=0 Mar 19 09:23:34.943694 master-0 kubenswrapper[13205]: I0319 09:23:34.943640 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c" exitCode=0 Mar 19 09:23:34.943694 master-0 kubenswrapper[13205]: I0319 09:23:34.943676 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360" exitCode=0 Mar 19 09:23:34.943694 master-0 kubenswrapper[13205]: I0319 09:23:34.943683 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0" exitCode=0 Mar 19 09:23:34.948795 master-0 kubenswrapper[13205]: E0319 09:23:34.948749 13205 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 09:23:34.949396 master-0 kubenswrapper[13205]: I0319 09:23:34.949349 13205 generic.go:334] "Generic (PLEG): container finished" podID="741c9d25-7634-41c0-bfe4-b7a15de4b341" containerID="e404f4723d8631f073faddbc4262589635650594028697bdf7da895f9918c63d" exitCode=0 Mar 19 09:23:34.951976 master-0 kubenswrapper[13205]: I0319 09:23:34.951932 13205 generic.go:334] "Generic (PLEG): container finished" podID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" containerID="f2f4573ac6359250badfecb43e628f11a57ba451127ad683fe2723ca4c3b389c" exitCode=0 Mar 19 09:23:34.955528 master-0 kubenswrapper[13205]: I0319 09:23:34.955480 13205 generic.go:334] "Generic (PLEG): container finished" podID="1694c93a-9acb-4bec-bfd6-3ec370e7a0b4" 
containerID="33c416f3ddb853fb82ea998149e13a2a8f2bd563b1774b31ddf6b2c491ae3aa9" exitCode=0 Mar 19 09:23:34.960746 master-0 kubenswrapper[13205]: I0319 09:23:34.960693 13205 generic.go:334] "Generic (PLEG): container finished" podID="43cb2a3b-40e2-45ee-894a-6c833ee17efd" containerID="c4276c1e12973c262c98545548719e35835681298a10338c9d6009cc8f7eb867" exitCode=0 Mar 19 09:23:34.961973 master-0 kubenswrapper[13205]: I0319 09:23:34.961925 13205 generic.go:334] "Generic (PLEG): container finished" podID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerID="46c63e43dc61899ca4cb1732e5d7d4e693a722f5fb486db67fb30cfa5bfc8af5" exitCode=0 Mar 19 09:23:34.964578 master-0 kubenswrapper[13205]: I0319 09:23:34.964296 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-jg9m5_259794ab-d027-497a-b08e-5a6d79057668/catalog-operator/0.log" Mar 19 09:23:34.964751 master-0 kubenswrapper[13205]: I0319 09:23:34.964599 13205 generic.go:334] "Generic (PLEG): container finished" podID="259794ab-d027-497a-b08e-5a6d79057668" containerID="0bb6d4411c90b21c40d2ebf35d55a831d972d567e97bde63d3acc0f2997756c7" exitCode=1 Mar 19 09:23:34.968145 master-0 kubenswrapper[13205]: I0319 09:23:34.968087 13205 generic.go:334] "Generic (PLEG): container finished" podID="d664acc4-ec4f-4078-ae93-404a14ea18fc" containerID="f068dc00867ec832963c43c66c2b3ba5e5c27207844ca25057536cc59dfa3810" exitCode=0 Mar 19 09:23:34.973481 master-0 kubenswrapper[13205]: I0319 09:23:34.973424 13205 generic.go:334] "Generic (PLEG): container finished" podID="a1098584-43b9-4f2c-83d2-22d95fb7b0c3" containerID="dbe5b6ac78d411669d4c2885f202f3dc2681af9deb4ef2161f47be9747a76bd6" exitCode=0 Mar 19 09:23:34.976317 master-0 kubenswrapper[13205]: I0319 09:23:34.976262 13205 generic.go:334] "Generic (PLEG): container finished" podID="f0c75102-6790-4ed3-84da-61c3611186f8" containerID="46cd0596efe1a555d079c79fdb72a64ad03bb94cd6e0d19c502033e4b3f35b63" exitCode=0 Mar 19 
09:23:34.979013 master-0 kubenswrapper[13205]: I0319 09:23:34.978963 13205 generic.go:334] "Generic (PLEG): container finished" podID="bf5dde46-8a95-46a6-bee5-20d3a58f33ee" containerID="6c747f057ccd974ebffe1ad8f45ae4d2b2720a2e2e97d3a1aa69720b2461f5fb" exitCode=0 Mar 19 09:23:34.984991 master-0 kubenswrapper[13205]: I0319 09:23:34.984949 13205 generic.go:334] "Generic (PLEG): container finished" podID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerID="301aebb0e9930fecf725f0201f719e1159eb2c1c4f88b41cf02dfb10a0bbec0d" exitCode=0 Mar 19 09:23:34.988088 master-0 kubenswrapper[13205]: I0319 09:23:34.988023 13205 generic.go:334] "Generic (PLEG): container finished" podID="4abcf2ea-50f5-4d62-8a23-583438e5b451" containerID="0ca36c4228886afdfb6b80a61e2423dd76188b00985839f0f0ad53c1f5d31db7" exitCode=0 Mar 19 09:23:34.989400 master-0 kubenswrapper[13205]: I0319 09:23:34.989361 13205 generic.go:334] "Generic (PLEG): container finished" podID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerID="d486a2c521f4c2c3eb232b1929f8a1ec255878f2382227f7f128e10063843ecc" exitCode=0 Mar 19 09:23:34.990744 master-0 kubenswrapper[13205]: I0319 09:23:34.990717 13205 generic.go:334] "Generic (PLEG): container finished" podID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerID="1e814e1f8603ada52f29b21f78df17d1b4dc0c1bc66fb422a5b77d8e27ae2d59" exitCode=0 Mar 19 09:23:34.994078 master-0 kubenswrapper[13205]: I0319 09:23:34.994048 13205 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="1de935d5d79686ee37ae77f43c7f709d103c6ab561712f1da495ac19ccceba4b" exitCode=0 Mar 19 09:23:34.994078 master-0 kubenswrapper[13205]: I0319 09:23:34.994070 13205 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="3d15d0fa4a3f8c9035c8ce9b72d3cf571d79c5e3676c413632c3d1ba3c37a426" exitCode=0 Mar 19 09:23:34.994078 master-0 kubenswrapper[13205]: I0319 09:23:34.994079 13205 generic.go:334] "Generic (PLEG): 
container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="2dce09604f673a98b5b76aa5ab393a537cdfc70dd6be1c99472f960c60ad55b9" exitCode=0 Mar 19 09:23:34.994234 master-0 kubenswrapper[13205]: I0319 09:23:34.994087 13205 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="886f43f428dc7d770e78699ea2b9793dc0fcaa7dc9eeaeafd637bd2727c22201" exitCode=0 Mar 19 09:23:34.994234 master-0 kubenswrapper[13205]: I0319 09:23:34.994095 13205 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="e715d0ff200bfc6a3198a0daa26814bad61e6acd8631c88afff9d4a08fe673ba" exitCode=0 Mar 19 09:23:34.994234 master-0 kubenswrapper[13205]: I0319 09:23:34.994103 13205 generic.go:334] "Generic (PLEG): container finished" podID="979d4d12-a560-4309-a1d3-cbebe853e8ea" containerID="11e09cac68fe5f9a91247cf89d443e062789ce0301fe0e6f213f48df912e0870" exitCode=0 Mar 19 09:23:34.995702 master-0 kubenswrapper[13205]: I0319 09:23:34.995672 13205 generic.go:334] "Generic (PLEG): container finished" podID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerID="e8716e6b475b5ecb7bf3ee3310e613b5df09c280bbb15165ff8d6e13b5af9e6b" exitCode=0 Mar 19 09:23:34.995702 master-0 kubenswrapper[13205]: I0319 09:23:34.995694 13205 generic.go:334] "Generic (PLEG): container finished" podID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerID="82319940bf8e72e7e1c996daea2af1c07c45f38503055b429ac09e5abb8f28d6" exitCode=0 Mar 19 09:23:34.999471 master-0 kubenswrapper[13205]: I0319 09:23:34.999442 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:23:34.999808 master-0 kubenswrapper[13205]: I0319 09:23:34.999775 13205 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" 
containerID="ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b" exitCode=1 Mar 19 09:23:34.999897 master-0 kubenswrapper[13205]: I0319 09:23:34.999883 13205 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="e5e1897ddbf62a1e1975ee8d4b56ad3a8cd0b0cf3d4e0758eac825b5a75e9b66" exitCode=0 Mar 19 09:23:35.003886 master-0 kubenswrapper[13205]: I0319 09:23:35.003849 13205 generic.go:334] "Generic (PLEG): container finished" podID="e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb" containerID="2c0b681fce22722dc6bda98cf745e2b79d2558bec9534ca23b3f5d2d7fcdef7a" exitCode=0 Mar 19 09:23:35.005574 master-0 kubenswrapper[13205]: I0319 09:23:35.005501 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-rh692_7b29cb7b-26d2-4fab-9e03-2d7fdf937592/olm-operator/0.log" Mar 19 09:23:35.005574 master-0 kubenswrapper[13205]: I0319 09:23:35.005549 13205 generic.go:334] "Generic (PLEG): container finished" podID="7b29cb7b-26d2-4fab-9e03-2d7fdf937592" containerID="2aae1324ea9ac71e757c6b6742bbfe17bf26ff22a4f1597837954f981813c18e" exitCode=1 Mar 19 09:23:35.007493 master-0 kubenswrapper[13205]: I0319 09:23:35.007464 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/0.log" Mar 19 09:23:35.007865 master-0 kubenswrapper[13205]: I0319 09:23:35.007835 13205 generic.go:334] "Generic (PLEG): container finished" podID="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" containerID="16dabbfac23a88b18e7a1e5f639f318226358e768cd4e0f4bf6b8327e7b845c9" exitCode=1 Mar 19 09:23:35.007865 master-0 kubenswrapper[13205]: E0319 09:23:35.007854 13205 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:23:35.010333 master-0 kubenswrapper[13205]: I0319 09:23:35.010290 13205 generic.go:334] "Generic (PLEG): container 
finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="a80a075ae2d2bfe0e545df390d9ff0ad18516cad1ed3ad4a716e570d8e5f21c1" exitCode=0 Mar 19 09:23:35.013836 master-0 kubenswrapper[13205]: E0319 09:23:35.013789 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 09:23:35.108303 master-0 kubenswrapper[13205]: E0319 09:23:35.108233 13205 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:23:35.127914 master-0 kubenswrapper[13205]: I0319 09:23:35.127846 13205 manager.go:324] Recovery completed Mar 19 09:23:35.148983 master-0 kubenswrapper[13205]: E0319 09:23:35.148859 13205 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 09:23:35.177882 master-0 kubenswrapper[13205]: E0319 09:23:35.177806 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:23:35.178261 master-0 kubenswrapper[13205]: W0319 09:23:35.178204 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 WatchSource:0}: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the 
container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:23:35.192542 master-0 kubenswrapper[13205]: I0319 09:23:35.192444 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.195223 master-0 kubenswrapper[13205]: I0319 09:23:35.195161 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.195223 master-0 kubenswrapper[13205]: I0319 09:23:35.195217 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.195350 master-0 kubenswrapper[13205]: I0319 09:23:35.195232 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.198427 master-0 kubenswrapper[13205]: I0319 09:23:35.198382 13205 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 09:23:35.198427 master-0 kubenswrapper[13205]: I0319 09:23:35.198415 13205 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 09:23:35.198562 master-0 kubenswrapper[13205]: I0319 09:23:35.198445 13205 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:23:35.198756 master-0 kubenswrapper[13205]: I0319 09:23:35.198721 13205 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 19 09:23:35.198813 master-0 kubenswrapper[13205]: I0319 09:23:35.198745 13205 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 19 09:23:35.198813 master-0 kubenswrapper[13205]: I0319 09:23:35.198773 13205 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 19 09:23:35.198813 master-0 kubenswrapper[13205]: I0319 09:23:35.198783 13205 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 19 09:23:35.198813 master-0 kubenswrapper[13205]: I0319 09:23:35.198791 13205 policy_none.go:49] "None policy: Start" Mar 19 09:23:35.203067 master-0 
kubenswrapper[13205]: I0319 09:23:35.202917 13205 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 09:23:35.203067 master-0 kubenswrapper[13205]: I0319 09:23:35.203062 13205 state_mem.go:35] "Initializing new in-memory state store" Mar 19 09:23:35.203567 master-0 kubenswrapper[13205]: I0319 09:23:35.203486 13205 state_mem.go:75] "Updated machine memory state" Mar 19 09:23:35.203567 master-0 kubenswrapper[13205]: I0319 09:23:35.203518 13205 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 19 09:23:35.209219 master-0 kubenswrapper[13205]: E0319 09:23:35.209138 13205 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:23:35.216629 master-0 kubenswrapper[13205]: I0319 09:23:35.216597 13205 manager.go:334] "Starting Device Plugin manager" Mar 19 09:23:35.216777 master-0 kubenswrapper[13205]: I0319 09:23:35.216749 13205 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 09:23:35.216777 master-0 kubenswrapper[13205]: I0319 09:23:35.216773 13205 server.go:79] "Starting device plugin registration server" Mar 19 09:23:35.217215 master-0 kubenswrapper[13205]: I0319 09:23:35.217182 13205 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 09:23:35.217285 master-0 kubenswrapper[13205]: I0319 09:23:35.217204 13205 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 09:23:35.217506 master-0 kubenswrapper[13205]: I0319 09:23:35.217475 13205 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 09:23:35.217617 master-0 kubenswrapper[13205]: I0319 09:23:35.217592 13205 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 09:23:35.217617 master-0 kubenswrapper[13205]: I0319 09:23:35.217608 13205 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Mar 19 09:23:35.223727 master-0 kubenswrapper[13205]: E0319 09:23:35.223681 13205 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:23:35.317979 master-0 kubenswrapper[13205]: I0319 09:23:35.317859 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.319710 master-0 kubenswrapper[13205]: I0319 09:23:35.319676 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.319786 master-0 kubenswrapper[13205]: I0319 09:23:35.319726 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.319786 master-0 kubenswrapper[13205]: I0319 09:23:35.319736 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.319786 master-0 kubenswrapper[13205]: I0319 09:23:35.319756 13205 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:23:35.320952 master-0 kubenswrapper[13205]: E0319 09:23:35.320830 13205 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 09:23:35.415405 master-0 kubenswrapper[13205]: E0319 09:23:35.415251 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 09:23:35.521996 master-0 kubenswrapper[13205]: I0319 09:23:35.521881 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.524507 
master-0 kubenswrapper[13205]: I0319 09:23:35.524459 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.524601 master-0 kubenswrapper[13205]: I0319 09:23:35.524515 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.524601 master-0 kubenswrapper[13205]: I0319 09:23:35.524554 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.524601 master-0 kubenswrapper[13205]: I0319 09:23:35.524584 13205 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:23:35.525765 master-0 kubenswrapper[13205]: E0319 09:23:35.525708 13205 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 09:23:35.549758 master-0 kubenswrapper[13205]: I0319 09:23:35.549619 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:23:35.549936 master-0 kubenswrapper[13205]: I0319 09:23:35.549798 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.552049 master-0 kubenswrapper[13205]: I0319 09:23:35.552004 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.552049 master-0 kubenswrapper[13205]: I0319 09:23:35.552050 13205 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.552204 master-0 kubenswrapper[13205]: I0319 09:23:35.552060 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.552250 master-0 kubenswrapper[13205]: I0319 09:23:35.552215 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.552374 master-0 kubenswrapper[13205]: I0319 09:23:35.552343 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.554340 master-0 kubenswrapper[13205]: I0319 09:23:35.554298 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.554340 master-0 kubenswrapper[13205]: I0319 09:23:35.554336 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.554456 master-0 kubenswrapper[13205]: I0319 09:23:35.554348 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.554618 master-0 kubenswrapper[13205]: I0319 09:23:35.554591 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.554802 master-0 kubenswrapper[13205]: I0319 09:23:35.554762 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.554867 master-0 kubenswrapper[13205]: I0319 09:23:35.554807 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.554867 master-0 kubenswrapper[13205]: I0319 09:23:35.554808 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.554956 master-0 kubenswrapper[13205]: I0319 09:23:35.554820 13205 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.557199 master-0 kubenswrapper[13205]: I0319 09:23:35.557164 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.557199 master-0 kubenswrapper[13205]: I0319 09:23:35.557203 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.557349 master-0 kubenswrapper[13205]: I0319 09:23:35.557212 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.557349 master-0 kubenswrapper[13205]: I0319 09:23:35.557348 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.557489 master-0 kubenswrapper[13205]: I0319 09:23:35.557402 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.557566 master-0 kubenswrapper[13205]: I0319 09:23:35.557490 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.557566 master-0 kubenswrapper[13205]: I0319 09:23:35.557518 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.557637 master-0 kubenswrapper[13205]: I0319 09:23:35.557611 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.560250 master-0 kubenswrapper[13205]: I0319 09:23:35.559998 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.560250 master-0 kubenswrapper[13205]: I0319 09:23:35.560036 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 
19 09:23:35.560250 master-0 kubenswrapper[13205]: I0319 09:23:35.560070 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.560455 master-0 kubenswrapper[13205]: I0319 09:23:35.560367 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.560679 master-0 kubenswrapper[13205]: I0319 09:23:35.560651 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.561273 master-0 kubenswrapper[13205]: I0319 09:23:35.561232 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.561335 master-0 kubenswrapper[13205]: I0319 09:23:35.561282 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.561335 master-0 kubenswrapper[13205]: I0319 09:23:35.561295 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.563743 master-0 kubenswrapper[13205]: I0319 09:23:35.563699 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.563743 master-0 kubenswrapper[13205]: I0319 09:23:35.563739 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.563743 master-0 kubenswrapper[13205]: I0319 09:23:35.563750 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.563888 master-0 kubenswrapper[13205]: I0319 09:23:35.563776 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.563888 master-0 kubenswrapper[13205]: I0319 09:23:35.563804 13205 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.563888 master-0 kubenswrapper[13205]: I0319 09:23:35.563813 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.563971 master-0 kubenswrapper[13205]: I0319 09:23:35.563891 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.563971 master-0 kubenswrapper[13205]: I0319 09:23:35.563949 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.566374 master-0 kubenswrapper[13205]: I0319 09:23:35.566339 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.566374 master-0 kubenswrapper[13205]: I0319 09:23:35.566363 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.566374 master-0 kubenswrapper[13205]: I0319 09:23:35.566372 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.566698 master-0 kubenswrapper[13205]: I0319 09:23:35.566666 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.566698 master-0 kubenswrapper[13205]: I0319 09:23:35.566697 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.566785 master-0 kubenswrapper[13205]: I0319 09:23:35.566706 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.566898 master-0 kubenswrapper[13205]: I0319 09:23:35.566831 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"e7ac13cba0a41afefd1f1913bc7aba4a187c6d99752100ec1e36b10b44ac9c6a"} Mar 19 09:23:35.566932 master-0 kubenswrapper[13205]: I0319 09:23:35.566894 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"d2773f59c5e5fc7c4c20d27964b8855d429ffb69ddd44594d1e039aab3c6d9c7"} Mar 19 09:23:35.566932 master-0 kubenswrapper[13205]: I0319 09:23:35.566910 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"1cd8ba1cf946b8e03e8d14ad1a9ca15bc751df12a73a64e9d4a3982985753d17"} Mar 19 09:23:35.566932 master-0 kubenswrapper[13205]: I0319 09:23:35.566921 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"9889603cf425a1afe622f697ec4d233d82f7e355b75cc078b65e38e02fed7bd5"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.566933 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.566949 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.566963 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerDied","Data":"fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.566973 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerStarted","Data":"8bbb7eb717a10731a76fbab7e75a4760990dac18f169f5c55d4ff290082a576b"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.566982 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"793cfb93f2346e0ad23e32cbd1e114aae92c03db2ff0726f899f8a1c39d66416"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.566992 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"6d8e777ee2c690477b890e212d15377f6f78a023a47f6d1ccdb66d4fd4236c20"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.567000 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"57919871ecdce20adcf14d4b3e782688c40e27d380e27e5683da1cfdca89a184"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.567009 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"6b51526a63cb4fc4843a03fc75fd50c63454c0795793d3149e658718010b95b1"} Mar 19 09:23:35.567016 master-0 kubenswrapper[13205]: I0319 09:23:35.567019 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"d6cda39585354e47346ec04d7e9023161d8c669dfe02492069483d076fdb9801"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567031 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1c72bc9f1c7efb1bcafcf4f7660e88081a0397f913b23e1285005ab7524d43" Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567051 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419f90df85200464073bb55727a37114d61c84e4d555b334b5798b07351fb1d6" Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567068 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f695fbbb2ac33712845536e84cccd0ed476913549534361c504ad37ba881e39" Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567096 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567105 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567113 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567121 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567130 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567138 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567147 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567156 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567164 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"7cd4cfeb35d9cb7e8bc213abe4e5f2a9ecc0b4807e7e9244214faaeba9632ab5"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567197 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c82ea247d4a04551474a0bf79f03cd9f98a0925e6c68fa6c0c9d75dba8c1773c" Mar 19 
09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567209 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"95378a840215d5780aa88df876aac909","Type":"ContainerStarted","Data":"6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567217 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"95378a840215d5780aa88df876aac909","Type":"ContainerStarted","Data":"76a2be65b345aaa03d42847ddf4106be40d256a72f66630810b64aeb72f9c081"} Mar 19 09:23:35.567246 master-0 kubenswrapper[13205]: I0319 09:23:35.567247 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5102e60e07b0c0187e422871eca34d56a4b64890354534ac5bb4405ed5a663d3" Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567263 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea06326f75dbe8dd7c60652c7838fe0eb8d997984652bd4f5b739f7370b57187" Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567272 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee23309cc49a7d14cc3f6a92bd46ad644b4ffd9ef72d1521784109d325534ba" Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567297 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"792d2b5907d7be3b52add934725c063cf367a575639846dbd622e4989463bf6d"} Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567306 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"ff92ed2afc9866bcdca7010b112a4b7e2fe7402710ba37be20aa1e6f3111dc9b"} Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567316 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"e5e1897ddbf62a1e1975ee8d4b56ad3a8cd0b0cf3d4e0758eac825b5a75e9b66"} Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567325 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"5f0b7606a412dcca4dd370553910b12ad443e3587ee9a8d70a1100b889c51bbc"} Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567353 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205c73b1ea5a301df50c88c2833b1992d29a39f06232166d5125d802ffe3e979" Mar 19 09:23:35.567705 master-0 kubenswrapper[13205]: I0319 09:23:35.567361 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="628162d008ef66056da78d4bcff9fb80227ffcc627a246a21dbba2cd871accd4" Mar 19 09:23:35.568187 master-0 kubenswrapper[13205]: I0319 09:23:35.568151 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:35.568225 master-0 kubenswrapper[13205]: I0319 09:23:35.568196 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:35.568225 master-0 kubenswrapper[13205]: I0319 09:23:35.568208 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:35.629913 master-0 kubenswrapper[13205]: I0319 09:23:35.629854 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:23:35.630131 master-0 kubenswrapper[13205]: I0319 09:23:35.629939 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:35.630131 master-0 kubenswrapper[13205]: I0319 09:23:35.629961 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:35.630131 master-0 kubenswrapper[13205]: I0319 09:23:35.629985 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:23:35.630131 master-0 kubenswrapper[13205]: I0319 09:23:35.630001 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:23:35.630131 master-0 kubenswrapper[13205]: I0319 09:23:35.630016 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:23:35.630131 master-0 kubenswrapper[13205]: I0319 09:23:35.630034 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630107 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630203 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630230 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod 
\"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630249 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630267 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630310 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630327 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630344 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:35.630369 master-0 kubenswrapper[13205]: I0319 09:23:35.630361 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:23:35.630634 master-0 kubenswrapper[13205]: I0319 09:23:35.630378 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.630634 master-0 kubenswrapper[13205]: I0319 09:23:35.630396 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.630634 master-0 kubenswrapper[13205]: I0319 09:23:35.630418 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.630634 master-0 kubenswrapper[13205]: I0319 09:23:35.630439 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.715821 master-0 kubenswrapper[13205]: W0319 09:23:35.715576 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:23:35.715821 master-0 kubenswrapper[13205]: E0319 09:23:35.715685 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:23:35.732284 master-0 kubenswrapper[13205]: I0319 09:23:35.732180 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.732284 master-0 kubenswrapper[13205]: I0319 09:23:35.732256 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:23:35.732284 master-0 kubenswrapper[13205]: I0319 09:23:35.732277 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732284 master-0 kubenswrapper[13205]: I0319 09:23:35.732298 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732697 master-0 kubenswrapper[13205]: I0319 09:23:35.732446 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.732697 master-0 kubenswrapper[13205]: I0319 09:23:35.732589 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732697 master-0 kubenswrapper[13205]: I0319 09:23:35.732587 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.732794 master-0 kubenswrapper[13205]: I0319 09:23:35.732694 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732794 master-0 kubenswrapper[13205]: I0319 09:23:35.732712 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732794 master-0 kubenswrapper[13205]: I0319 09:23:35.732728 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732890 master-0 kubenswrapper[13205]: I0319 09:23:35.732759 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:35.732890 master-0 kubenswrapper[13205]: I0319 09:23:35.732806 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732890 master-0 kubenswrapper[13205]: I0319 09:23:35.732829 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:23:35.732890 master-0 kubenswrapper[13205]: I0319 09:23:35.732849 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:35.732998 master-0 kubenswrapper[13205]: I0319 09:23:35.732868 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.732998 master-0 kubenswrapper[13205]: I0319 09:23:35.732850 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.732998 master-0 kubenswrapper[13205]: I0319 09:23:35.732936 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.732998 master-0 kubenswrapper[13205]: I0319 09:23:35.732939 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.732998 master-0 kubenswrapper[13205]: I0319 09:23:35.732981 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.733150 master-0 kubenswrapper[13205]: I0319 09:23:35.733010 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.733150 master-0 kubenswrapper[13205]: I0319 09:23:35.733072 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.733212 master-0 kubenswrapper[13205]: I0319 09:23:35.733142 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:35.733212 master-0 kubenswrapper[13205]: I0319 09:23:35.733179 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:23:35.733212 master-0 kubenswrapper[13205]: I0319 09:23:35.733205 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.733301 master-0 kubenswrapper[13205]: I0319 09:23:35.733230 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.733301 master-0 kubenswrapper[13205]: I0319 09:23:35.733255 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.733301 master-0 kubenswrapper[13205]: I0319 09:23:35.733259 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:35.733301 master-0 kubenswrapper[13205]: I0319 09:23:35.733288 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:35.733428 master-0 kubenswrapper[13205]: I0319 09:23:35.733285 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.733428 master-0 kubenswrapper[13205]: I0319 09:23:35.733342 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.733428 master-0 kubenswrapper[13205]: I0319 09:23:35.733375 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.733428 master-0 kubenswrapper[13205]: I0319 09:23:35.733393 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:35.733428 master-0 kubenswrapper[13205]: I0319 09:23:35.733410 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.733607 master-0 kubenswrapper[13205]: I0319 09:23:35.733429 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:23:35.733607 master-0 kubenswrapper[13205]: I0319 09:23:35.733446 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:35.733607 master-0 kubenswrapper[13205]: I0319 09:23:35.733473 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.733607 master-0 kubenswrapper[13205]: I0319 09:23:35.733507 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:35.733607 master-0 kubenswrapper[13205]: I0319 09:23:35.733596 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"95378a840215d5780aa88df876aac909\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:23:35.733750 master-0 kubenswrapper[13205]: I0319 09:23:35.733609 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:35.733750 master-0 kubenswrapper[13205]: I0319 09:23:35.733642 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:35.794863 master-0 kubenswrapper[13205]: I0319 09:23:35.794804 13205 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:23:35.864908 master-0 kubenswrapper[13205]: I0319 09:23:35.864856 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:35.867698 master-0 kubenswrapper[13205]: I0319 09:23:35.867655 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:35.867766 master-0 kubenswrapper[13205]: I0319 09:23:35.867704 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:35.867766 master-0 kubenswrapper[13205]: I0319 09:23:35.867714 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:35.926296 master-0 kubenswrapper[13205]: I0319 09:23:35.926116 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:35.928767 master-0 kubenswrapper[13205]: I0319 09:23:35.928712 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:35.928849 master-0 kubenswrapper[13205]: I0319 09:23:35.928774 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:35.928849 master-0 kubenswrapper[13205]: I0319 09:23:35.928788 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:35.928849 master-0 kubenswrapper[13205]: I0319 09:23:35.928810 13205 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:23:35.930005 master-0 kubenswrapper[13205]: E0319 09:23:35.929793 13205 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:23:36.021059 master-0 kubenswrapper[13205]: I0319 09:23:36.020999 13205 generic.go:334] "Generic (PLEG): container finished" podID="bf5dde46-8a95-46a6-bee5-20d3a58f33ee" containerID="81250100c299ae118bbe5f9dc4bbd12fa1d14954ac94a5bc4484fa54e2475d2b" exitCode=0
Mar 19 09:23:36.029432 master-0 kubenswrapper[13205]: I0319 09:23:36.029390 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/0.log"
Mar 19 09:23:36.029586 master-0 kubenswrapper[13205]: I0319 09:23:36.029453 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="9889603cf425a1afe622f697ec4d233d82f7e355b75cc078b65e38e02fed7bd5" exitCode=255
Mar 19 09:23:36.029586 master-0 kubenswrapper[13205]: I0319 09:23:36.029502 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"9889603cf425a1afe622f697ec4d233d82f7e355b75cc078b65e38e02fed7bd5"}
Mar 19 09:23:36.029658 master-0 kubenswrapper[13205]: I0319 09:23:36.029581 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:36.029658 master-0 kubenswrapper[13205]: I0319 09:23:36.029633 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:36.029658 master-0 kubenswrapper[13205]: I0319 09:23:36.029648 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:36.030755 master-0 kubenswrapper[13205]: I0319 09:23:36.030694 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:36.034113 master-0 kubenswrapper[13205]: I0319 09:23:36.030703 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:36.035985 master-0 kubenswrapper[13205]: I0319 09:23:36.035958 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:36.036114 master-0 kubenswrapper[13205]: I0319 09:23:36.036099 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:36.036203 master-0 kubenswrapper[13205]: I0319 09:23:36.036190 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:36.037175 master-0 kubenswrapper[13205]: I0319 09:23:36.037136 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:36.037239 master-0 kubenswrapper[13205]: I0319 09:23:36.037218 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:36.037282 master-0 kubenswrapper[13205]: I0319 09:23:36.037236 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:36.037673 master-0 kubenswrapper[13205]: I0319 09:23:36.036211 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:36.037767 master-0 kubenswrapper[13205]: I0319 09:23:36.037756 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:36.038319 master-0 kubenswrapper[13205]: I0319 09:23:36.038289 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:36.038362 master-0 kubenswrapper[13205]: I0319 09:23:36.038145 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:36.038362 master-0 kubenswrapper[13205]: I0319 09:23:36.038352 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:36.038426 master-0 kubenswrapper[13205]: I0319 09:23:36.038364 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:36.038426 master-0 kubenswrapper[13205]: I0319 09:23:36.036216 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:36.038506 master-0 kubenswrapper[13205]: I0319 09:23:36.038462 13205 scope.go:117] "RemoveContainer" containerID="9889603cf425a1afe622f697ec4d233d82f7e355b75cc078b65e38e02fed7bd5"
Mar 19 09:23:36.038611 master-0 kubenswrapper[13205]: I0319 09:23:36.038468 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:36.038704 master-0 kubenswrapper[13205]: I0319 09:23:36.038673 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:36.217314 master-0 kubenswrapper[13205]: E0319 09:23:36.216798 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 09:23:36.242553 master-0 kubenswrapper[13205]: W0319 09:23:36.242425 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:23:36.242553 master-0 kubenswrapper[13205]: E0319 09:23:36.242492 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:23:36.371074 master-0 kubenswrapper[13205]: W0319 09:23:36.370684 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:23:36.371074 master-0 kubenswrapper[13205]: E0319 09:23:36.370800 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:23:36.386502 master-0 kubenswrapper[13205]: W0319 09:23:36.386163 13205 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:23:36.386700 master-0 kubenswrapper[13205]: E0319 09:23:36.386513 13205 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:23:36.730984 master-0 kubenswrapper[13205]: I0319 09:23:36.730782 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:36.735367 master-0 kubenswrapper[13205]: I0319 09:23:36.735315 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:36.735470 master-0 kubenswrapper[13205]: I0319 09:23:36.735373 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:36.735470 master-0 kubenswrapper[13205]: I0319 09:23:36.735389 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:36.735470 master-0 kubenswrapper[13205]: I0319 09:23:36.735417 13205 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:23:36.736679 master-0 kubenswrapper[13205]: E0319 09:23:36.736608 13205 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:23:36.795169 master-0 kubenswrapper[13205]: I0319 09:23:36.795099 13205 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:23:37.042677 master-0 kubenswrapper[13205]: I0319 09:23:37.042610 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/0.log"
Mar 19 09:23:37.042677 master-0 kubenswrapper[13205]: I0319 09:23:37.042684 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae"}
Mar 19 09:23:37.154396 master-0 kubenswrapper[13205]: I0319 09:23:37.154330 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:37.154607 master-0 kubenswrapper[13205]: I0319 09:23:37.154559 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:37.156729 master-0 kubenswrapper[13205]: I0319 09:23:37.156670 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:37.156835 master-0 kubenswrapper[13205]: I0319 09:23:37.156742 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:37.156835 master-0 kubenswrapper[13205]: I0319 09:23:37.156760 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:37.630435 master-0 kubenswrapper[13205]: I0319 09:23:37.630351 13205 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:38.057550 master-0 kubenswrapper[13205]: I0319 09:23:38.057454 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerStarted","Data":"614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5"}
Mar 19 09:23:38.057550 master-0 kubenswrapper[13205]: I0319 09:23:38.057552 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:38.061197 master-0 kubenswrapper[13205]: I0319 09:23:38.061126 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:38.061197 master-0 kubenswrapper[13205]: I0319 09:23:38.061180 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:38.061197 master-0 kubenswrapper[13205]: I0319 09:23:38.061189 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:38.337718 master-0 kubenswrapper[13205]: I0319 09:23:38.337616 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:38.340150 master-0 kubenswrapper[13205]: I0319 09:23:38.340094 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:38.340150 master-0 kubenswrapper[13205]: I0319 09:23:38.340152 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:38.340319 master-0 kubenswrapper[13205]: I0319 09:23:38.340168 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:38.340319 master-0 kubenswrapper[13205]: I0319 09:23:38.340197 13205 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:23:39.070339 master-0 kubenswrapper[13205]: I0319 09:23:39.070179 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerStarted","Data":"f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0"}
Mar 19 09:23:39.070339 master-0 kubenswrapper[13205]: I0319 09:23:39.070280 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:39.072380 master-0 kubenswrapper[13205]: I0319 09:23:39.072333 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:39.072496 master-0 kubenswrapper[13205]: I0319 09:23:39.072390 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:39.072496 master-0 kubenswrapper[13205]: I0319 09:23:39.072407 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:41.084246 master-0 kubenswrapper[13205]: I0319 09:23:41.084179 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerStarted","Data":"be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b"}
Mar 19 09:23:41.638058 master-0 kubenswrapper[13205]: I0319 09:23:41.637985 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:41.638252 master-0 kubenswrapper[13205]: I0319 09:23:41.638211 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:41.641326 master-0 kubenswrapper[13205]: I0319 09:23:41.641287 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:41.641326 master-0 kubenswrapper[13205]: I0319 09:23:41.641324 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:41.641449 master-0 kubenswrapper[13205]: I0319 09:23:41.641332 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:41.643940 master-0 kubenswrapper[13205]: I0319 09:23:41.643897 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:23:42.090161 master-0 kubenswrapper[13205]: I0319 09:23:42.089820 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:42.093076 master-0 kubenswrapper[13205]: I0319 09:23:42.093014 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:42.093076 master-0 kubenswrapper[13205]: I0319 09:23:42.093066 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:42.093076 master-0 kubenswrapper[13205]: I0319 09:23:42.093079 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:42.272096 master-0 kubenswrapper[13205]: I0319 09:23:42.271994 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:42.272384 master-0 kubenswrapper[13205]: I0319 09:23:42.272333 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:42.275308 master-0 kubenswrapper[13205]: I0319 09:23:42.275250 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:42.275308 master-0 kubenswrapper[13205]: I0319 09:23:42.275303 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:42.275308 master-0 kubenswrapper[13205]: I0319 09:23:42.275316 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:42.900884 master-0 kubenswrapper[13205]: I0319 09:23:42.900801 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:42.901121 master-0 kubenswrapper[13205]: I0319 09:23:42.901038 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:23:42.903618 master-0 kubenswrapper[13205]: I0319 09:23:42.903572 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:23:42.903618 master-0 kubenswrapper[13205]: I0319 09:23:42.903614 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:23:42.903618 master-0 kubenswrapper[13205]: I0319 09:23:42.903625 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:23:43.101640 master-0 kubenswrapper[13205]: I0319 09:23:43.101581 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0"
event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerStarted","Data":"436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8"} Mar 19 09:23:43.901471 master-0 kubenswrapper[13205]: I0319 09:23:43.901383 13205 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.32.10:9980/readyz\": context deadline exceeded" start-of-body= Mar 19 09:23:43.901471 master-0 kubenswrapper[13205]: I0319 09:23:43.901447 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" probeResult="failure" output="Get \"https://192.168.32.10:9980/readyz\": context deadline exceeded" Mar 19 09:23:44.012675 master-0 kubenswrapper[13205]: I0319 09:23:44.012570 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:44.012948 master-0 kubenswrapper[13205]: I0319 09:23:44.012721 13205 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:23:44.012948 master-0 kubenswrapper[13205]: I0319 09:23:44.012756 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:44.014624 master-0 kubenswrapper[13205]: I0319 09:23:44.014573 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:44.014624 master-0 kubenswrapper[13205]: I0319 09:23:44.014621 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:44.014624 master-0 kubenswrapper[13205]: I0319 09:23:44.014632 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:44.016728 master-0 kubenswrapper[13205]: I0319 09:23:44.016689 13205 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:44.105508 master-0 kubenswrapper[13205]: I0319 09:23:44.105459 13205 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:23:44.105508 master-0 kubenswrapper[13205]: I0319 09:23:44.105504 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:44.107427 master-0 kubenswrapper[13205]: I0319 09:23:44.107348 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:44.107427 master-0 kubenswrapper[13205]: I0319 09:23:44.107400 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:44.107427 master-0 kubenswrapper[13205]: I0319 09:23:44.107410 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:44.659820 master-0 kubenswrapper[13205]: I0319 09:23:44.659639 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:44.664434 master-0 kubenswrapper[13205]: I0319 09:23:44.664383 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:44.840260 master-0 kubenswrapper[13205]: I0319 09:23:44.840184 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:45.109524 master-0 kubenswrapper[13205]: I0319 09:23:45.109482 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:45.111466 master-0 kubenswrapper[13205]: I0319 09:23:45.111433 13205 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:45.111569 master-0 kubenswrapper[13205]: I0319 09:23:45.111470 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:45.111569 master-0 kubenswrapper[13205]: I0319 09:23:45.111481 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:45.223847 master-0 kubenswrapper[13205]: E0319 09:23:45.223781 13205 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:23:46.114219 master-0 kubenswrapper[13205]: I0319 09:23:46.114164 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:46.116389 master-0 kubenswrapper[13205]: I0319 09:23:46.116348 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:46.116473 master-0 kubenswrapper[13205]: I0319 09:23:46.116408 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:46.116473 master-0 kubenswrapper[13205]: I0319 09:23:46.116439 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:46.118305 master-0 kubenswrapper[13205]: I0319 09:23:46.118274 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:23:47.124101 master-0 kubenswrapper[13205]: I0319 09:23:47.124038 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:47.126186 master-0 kubenswrapper[13205]: I0319 09:23:47.126138 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"ac3507630eeeca1ec26dca5ed036e3bb","Type":"ContainerStarted","Data":"a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630"} Mar 19 09:23:47.126640 master-0 kubenswrapper[13205]: I0319 09:23:47.126609 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:47.126640 master-0 kubenswrapper[13205]: I0319 09:23:47.126638 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:47.126737 master-0 kubenswrapper[13205]: I0319 09:23:47.126653 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:48.127862 master-0 kubenswrapper[13205]: I0319 09:23:48.127821 13205 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:48.132434 master-0 kubenswrapper[13205]: I0319 09:23:48.132404 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:23:48.132577 master-0 kubenswrapper[13205]: I0319 09:23:48.132440 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:23:48.132577 master-0 kubenswrapper[13205]: I0319 09:23:48.132450 13205 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:23:48.641661 master-0 kubenswrapper[13205]: E0319 09:23:48.641273 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="3.2s" Mar 19 09:23:48.643614 master-0 kubenswrapper[13205]: I0319 09:23:48.643491 13205 trace.go:236] Trace[198980372]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:23:38.291) (total time: 10351ms): Mar 19 09:23:48.643614 master-0 kubenswrapper[13205]: Trace[198980372]: ---"Objects listed" error: 10351ms (09:23:48.643) Mar 19 09:23:48.643614 master-0 kubenswrapper[13205]: Trace[198980372]: [10.351536133s] [10.351536133s] END Mar 19 09:23:48.643614 master-0 kubenswrapper[13205]: I0319 09:23:48.643545 13205 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:23:48.644366 master-0 kubenswrapper[13205]: I0319 09:23:48.644331 13205 trace.go:236] Trace[189418098]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:23:38.053) (total time: 10590ms): Mar 19 09:23:48.644366 master-0 kubenswrapper[13205]: Trace[189418098]: ---"Objects listed" error: 10590ms (09:23:48.644) Mar 19 09:23:48.644366 master-0 kubenswrapper[13205]: Trace[189418098]: [10.590448438s] [10.590448438s] END Mar 19 09:23:48.644547 master-0 kubenswrapper[13205]: I0319 09:23:48.644367 13205 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:23:48.644547 master-0 kubenswrapper[13205]: I0319 09:23:48.644461 13205 trace.go:236] Trace[1546452573]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:23:37.712) (total time: 10931ms): Mar 19 09:23:48.644547 master-0 kubenswrapper[13205]: Trace[1546452573]: ---"Objects listed" error: 10931ms (09:23:48.644) Mar 19 09:23:48.644547 master-0 kubenswrapper[13205]: Trace[1546452573]: [10.931896262s] [10.931896262s] END Mar 19 09:23:48.644547 master-0 kubenswrapper[13205]: I0319 09:23:48.644477 13205 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:23:48.797004 master-0 kubenswrapper[13205]: I0319 09:23:48.796938 13205 apiserver.go:52] "Watching apiserver" Mar 19 09:23:48.811402 master-0 kubenswrapper[13205]: I0319 
09:23:48.811346 13205 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 09:23:48.812655 master-0 kubenswrapper[13205]: I0319 09:23:48.812598 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-kwrpk","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9","openshift-multus/multus-additional-cni-plugins-8kv6s","openshift-network-operator/network-operator-7bd846bfc4-b4d28","openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47","openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692","openshift-kube-storage-version-migrator/migrator-8487694857-g9497","openshift-marketplace/redhat-operators-4gs4g","openshift-kube-apiserver/kube-apiserver-master-0","openshift-marketplace/certified-operators-xr42z","openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d","openshift-multus/network-metrics-daemon-nq9vs","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7","openshift-service-ca/service-ca-79bc6b8d76-l54xv","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb","openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7","openshift-marketplace/community-operators-wfkb9","openshift-marketplace/marketplace-operator-89ccd998f-gxznr","openshift-network-node-identity/network-node-identity-slmgx","openshift-controller-manager/controller-manager-6c8fd866bf-g46sj","openshift-kube-scheduler/installer-4-master-0","openshift-marketplace/community-operators-2ct9k","openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr","openshift-etcd/installer-1-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6","openshift-apiserver/apiserver-7dcf67dd86-6hgld","openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c","openshi
ft-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59","openshift-kube-apiserver/installer-1-master-0","openshift-kube-controller-manager/installer-1-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-marketplace/certified-operators-7dmw4","openshift-multus/multus-bzdzd","openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk","openshift-kube-apiserver/installer-1-retry-1-master-0","openshift-cluster-version/cluster-version-operator-56d8475767-prd2q","openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5","openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd","openshift-etcd/etcd-master-0","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-ovn-kubernetes/ovnkube-node-vcxjs","openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9","openshift-network-diagnostics/network-check-target-4s5vc","openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm","openshift-network-operator/iptables-alerter-qfc76","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-marketplace/redhat-marketplace-995hm","openshift-dns-operator/dns-operator-9c5679d8f-cbw4r","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"] Mar 19 
09:23:48.812913 master-0 kubenswrapper[13205]: I0319 09:23:48.812868 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-kwrpk" Mar 19 09:23:48.817859 master-0 kubenswrapper[13205]: I0319 09:23:48.817808 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:23:48.818758 master-0 kubenswrapper[13205]: I0319 09:23:48.818715 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 09:23:48.818986 master-0 kubenswrapper[13205]: I0319 09:23:48.818890 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.818986 master-0 kubenswrapper[13205]: I0319 09:23:48.818923 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 09:23:48.821795 master-0 kubenswrapper[13205]: I0319 09:23:48.819300 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 09:23:48.821795 master-0 kubenswrapper[13205]: I0319 09:23:48.819304 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 09:23:48.829468 master-0 kubenswrapper[13205]: I0319 09:23:48.829420 13205 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="dbae0633-6a68-4488-aa81-b650c4c8d698" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.831127 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.831233 13205 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.831420 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.831639 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.831882 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.832586 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.832589 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.832689 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.832716 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.832805 master-0 kubenswrapper[13205]: I0319 09:23:48.832809 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.832841 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:23:48.833296 master-0 
kubenswrapper[13205]: I0319 09:23:48.832854 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.832875 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.832958 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.832963 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.833032 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.833071 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.833112 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 09:23:48.833296 master-0 kubenswrapper[13205]: I0319 09:23:48.833123 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:23:48.833652 master-0 kubenswrapper[13205]: I0319 09:23:48.833398 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 19 09:23:48.833652 master-0 kubenswrapper[13205]: I0319 09:23:48.833567 13205 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:23:48.833652 master-0 kubenswrapper[13205]: I0319 09:23:48.833586 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 09:23:48.833652 master-0 kubenswrapper[13205]: I0319 09:23:48.833595 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:23:48.833652 master-0 kubenswrapper[13205]: I0319 09:23:48.833608 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833571 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833715 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833744 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833786 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833850 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833887 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:23:48.834126 master-0 
kubenswrapper[13205]: I0319 09:23:48.833898 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833963 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.833991 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.834049 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.834061 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.834097 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.834100 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 09:23:48.834126 master-0 kubenswrapper[13205]: I0319 09:23:48.834140 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:23:48.834537 master-0 kubenswrapper[13205]: I0319 09:23:48.834175 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:23:48.834991 master-0 kubenswrapper[13205]: I0319 09:23:48.834822 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:23:48.835963 master-0 kubenswrapper[13205]: I0319 09:23:48.835924 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:23:48.835963 master-0 kubenswrapper[13205]: I0319 09:23:48.835934 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:23:48.836613 master-0 kubenswrapper[13205]: I0319 09:23:48.836583 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:23:48.840796 master-0 kubenswrapper[13205]: I0319 09:23:48.840752 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.841611 master-0 kubenswrapper[13205]: I0319 09:23:48.841589 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.841703 master-0 kubenswrapper[13205]: I0319 09:23:48.841696 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 09:23:48.842167 master-0 kubenswrapper[13205]: I0319 09:23:48.842126 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 09:23:48.842204 master-0 kubenswrapper[13205]: I0319 09:23:48.842173 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:23:48.844605 master-0 kubenswrapper[13205]: I0319 09:23:48.844509 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 
19 09:23:48.847486 master-0 kubenswrapper[13205]: I0319 09:23:48.847448 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 09:23:48.850978 master-0 kubenswrapper[13205]: I0319 09:23:48.850938 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:23:48.851158 master-0 kubenswrapper[13205]: I0319 09:23:48.851016 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 19 09:23:48.851158 master-0 kubenswrapper[13205]: I0319 09:23:48.851128 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:23:48.851256 master-0 kubenswrapper[13205]: I0319 09:23:48.851231 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 09:23:48.851318 master-0 kubenswrapper[13205]: I0319 09:23:48.851297 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:23:48.851385 master-0 kubenswrapper[13205]: I0319 09:23:48.851362 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 09:23:48.851582 master-0 kubenswrapper[13205]: I0319 09:23:48.851479 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 09:23:48.851582 master-0 kubenswrapper[13205]: I0319 09:23:48.851509 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:23:48.851694 master-0 kubenswrapper[13205]: I0319 09:23:48.851656 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:23:48.851694 master-0 kubenswrapper[13205]: I0319 09:23:48.851687 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:23:48.851763 master-0 kubenswrapper[13205]: I0319 09:23:48.851702 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:23:48.851830 master-0 kubenswrapper[13205]: I0319 09:23:48.851813 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 09:23:48.851869 master-0 kubenswrapper[13205]: I0319 09:23:48.851833 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:23:48.851994 master-0 kubenswrapper[13205]: I0319 09:23:48.851976 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 09:23:48.852139 master-0 kubenswrapper[13205]: I0319 09:23:48.852043 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:23:48.852174 master-0 kubenswrapper[13205]: I0319 09:23:48.852152 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 09:23:48.852204 master-0 kubenswrapper[13205]: I0319 09:23:48.852191 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:23:48.852232 master-0 kubenswrapper[13205]: I0319 09:23:48.852219 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:23:48.852338 master-0 kubenswrapper[13205]: I0319 09:23:48.852309 13205 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 09:23:48.852443 master-0 kubenswrapper[13205]: I0319 09:23:48.852428 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:23:48.852493 master-0 kubenswrapper[13205]: I0319 09:23:48.852446 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:23:48.852544 master-0 kubenswrapper[13205]: I0319 09:23:48.852489 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.852579 master-0 kubenswrapper[13205]: I0319 09:23:48.852478 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:23:48.852607 master-0 kubenswrapper[13205]: I0319 09:23:48.852425 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:23:48.853068 master-0 kubenswrapper[13205]: I0319 09:23:48.853043 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 09:23:48.854189 master-0 kubenswrapper[13205]: I0319 09:23:48.854152 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:23:48.854377 master-0 kubenswrapper[13205]: I0319 09:23:48.854352 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:23:48.854491 master-0 kubenswrapper[13205]: I0319 09:23:48.854445 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 09:23:48.856665 master-0 kubenswrapper[13205]: I0319 09:23:48.856587 13205 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:23:48.858509 master-0 kubenswrapper[13205]: I0319 09:23:48.858487 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:23:48.861137 master-0 kubenswrapper[13205]: I0319 09:23:48.861122 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:23:48.861390 master-0 kubenswrapper[13205]: I0319 09:23:48.861360 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:23:48.861461 master-0 kubenswrapper[13205]: I0319 09:23:48.861371 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:23:48.861652 master-0 kubenswrapper[13205]: I0319 09:23:48.861629 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 09:23:48.861739 master-0 kubenswrapper[13205]: I0319 09:23:48.861710 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:23:48.861790 master-0 kubenswrapper[13205]: I0319 09:23:48.861713 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:23:48.861822 master-0 kubenswrapper[13205]: I0319 09:23:48.861808 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:23:48.863488 master-0 kubenswrapper[13205]: I0319 09:23:48.863468 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 19 09:23:48.863943 master-0 kubenswrapper[13205]: I0319 09:23:48.863914 13205 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 09:23:48.877005 master-0 kubenswrapper[13205]: I0319 09:23:48.872206 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:23:48.877005 master-0 kubenswrapper[13205]: I0319 09:23:48.873133 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:23:48.877005 master-0 kubenswrapper[13205]: I0319 09:23:48.873275 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.880067 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.880593 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.880762 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.880933 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.881259 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.881422 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.885951 13205 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.886604 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 09:23:48.888650 master-0 kubenswrapper[13205]: I0319 09:23:48.888047 13205 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 09:23:48.889075 master-0 kubenswrapper[13205]: I0319 09:23:48.888835 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 09:23:48.889075 master-0 kubenswrapper[13205]: I0319 09:23:48.889059 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 09:23:48.899706 master-0 kubenswrapper[13205]: I0319 09:23:48.899609 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:23:48.911243 master-0 kubenswrapper[13205]: I0319 09:23:48.911175 13205 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 19 09:23:48.919633 master-0 kubenswrapper[13205]: I0319 09:23:48.919598 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 09:23:48.940144 master-0 kubenswrapper[13205]: I0319 09:23:48.940087 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 09:23:48.959935 master-0 kubenswrapper[13205]: I0319 09:23:48.959896 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 09:23:48.979283 master-0 kubenswrapper[13205]: I0319 09:23:48.979236 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 09:23:48.997600 master-0 
kubenswrapper[13205]: I0319 09:23:48.996963 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997029 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvnb9\" (UniqueName: \"kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997051 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997068 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997085 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod 
\"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997102 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww85l\" (UniqueName: \"kubernetes.io/projected/dc65ec1f-b8fb-40d6-ac39-46b255a33221-kube-api-access-ww85l\") pod \"csi-snapshot-controller-64854d9cff-v9s9c\" (UID: \"dc65ec1f-b8fb-40d6-ac39-46b255a33221\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997116 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-utilities\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997132 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997148 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: 
I0319 09:23:48.997167 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3374940a-612d-4335-8236-3ffe8d6e73a5-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997188 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997204 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997220 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997242 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " 
pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997259 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997282 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997300 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997318 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bdnt\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997333 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997349 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5jl\" (UniqueName: \"kubernetes.io/projected/64f60856-22dd-4560-acff-c620e17844a1-kube-api-access-cf5jl\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997365 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997381 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997397 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 
09:23:48.997421 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3374940a-612d-4335-8236-3ffe8d6e73a5-cache\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997449 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997473 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997490 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997507 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod 
\"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997521 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-etcd-serving-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997554 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997575 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997594 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-trusted-ca-bundle\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997621 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997637 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:23:48.997600 master-0 kubenswrapper[13205]: I0319 09:23:48.997653 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997670 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997686 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 
09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997703 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997722 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997738 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpcn\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-kube-api-access-kmpcn\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997756 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-client\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997771 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-cabundle\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997787 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997801 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997816 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997831 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997876 13205 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6sr8\" (UniqueName: \"kubernetes.io/projected/c5966fa8-b9f0-42ee-a75b-20014782366d-kube-api-access-v6sr8\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997910 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-utilities\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997938 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997955 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997970 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " 
pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.997986 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2x6\" (UniqueName: \"kubernetes.io/projected/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-kube-api-access-jw2x6\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998001 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998018 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998051 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9jf\" (UniqueName: \"kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998075 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998094 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998116 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998137 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-audit-dir\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998156 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: 
\"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998177 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwbw\" (UniqueName: \"kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998197 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998219 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-catalog-content\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998241 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998262 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998282 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998303 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998326 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998353 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998374 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998399 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998423 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbdq\" (UniqueName: \"kubernetes.io/projected/a3dddb56-d180-4b8a-85bd-77c3888d8f71-kube-api-access-nxbdq\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998446 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxfs\" (UniqueName: \"kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998471 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998494 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998516 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-catalog-content\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998570 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7t5\" (UniqueName: \"kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998595 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: 
I0319 09:23:48.998620 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcf6\" (UniqueName: \"kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998645 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f5s\" (UniqueName: \"kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998670 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998694 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hw6b\" (UniqueName: \"kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998718 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-node-pullsecrets\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998740 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-encryption-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998766 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvd6f\" (UniqueName: \"kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998789 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998811 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 
09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998834 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-utilities\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998858 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998883 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9plst\" (UniqueName: \"kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst\") pod \"csi-snapshot-controller-operator-5f5d689c6b-jv8lm\" (UID: \"8e073eb4-67f2-4de7-8848-50da73079dbc\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998931 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998958 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clpb5\" (UniqueName: \"kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5\") pod 
\"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.998986 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999011 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999034 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-serving-ca\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999057 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-key\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999080 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999106 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999130 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999153 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999176 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " 
pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999196 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999219 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999240 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999263 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j65pb\" (UniqueName: \"kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999285 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " 
pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999307 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999328 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999353 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999379 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999407 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blbc\" (UniqueName: 
\"kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:23:48.999325 master-0 kubenswrapper[13205]: I0319 09:23:48.999430 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:48.999446 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:48.999463 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:48.999480 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4rw\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-kube-api-access-5g4rw\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:48.999496 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v88k\" (UniqueName: \"kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:48.999514 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.000846 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb2a3b-40e2-45ee-894a-6c833ee17efd-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.001154 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f65184f-8fc2-4656-8776-a3b962aa1f5d-iptables-alerter-script\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.001444 
13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.001590 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-config\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002015 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d664acc4-ec4f-4078-ae93-404a14ea18fc-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002348 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-config\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002426 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-utilities\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " 
pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002466 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0c75102-6790-4ed3-84da-61c3611186f8-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002608 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3374940a-612d-4335-8236-3ffe8d6e73a5-cache\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002655 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6c1523-e77c-4aac-814c-05d41215c42f-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002684 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/259794ab-d027-497a-b08e-5a6d79057668-srv-cert\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002732 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3b50118d-f7c2-4bff-aca0-5c6623819baf-operand-assets\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002802 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.002868 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-catalog-content\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003058 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003099 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d2930b-486b-492d-983e-c6702d8f53a7-metrics-tls\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003220 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003290 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003341 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003468 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-serving-cert\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003508 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-env-overrides\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 
09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003514 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-config\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003751 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-env-overrides\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003765 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c247d991-809e-46b6-9617-9b05007b7560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003800 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-utilities\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003859 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-catalog-content\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " 
pod="openshift-marketplace/community-operators-wfkb9" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003944 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-daemon-config\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.003993 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/157e3524-eb27-41ca-b49d-2697ee1245ca-cni-binary-copy\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.004028 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.004081 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5966fa8-b9f0-42ee-a75b-20014782366d-utilities\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.004191 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-ovnkube-identity-cm\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " 
pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:23:49.004484 master-0 kubenswrapper[13205]: I0319 09:23:49.004477 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9076d131-644a-4332-8a70-34f6b0f71575-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:23:49.006108 master-0 kubenswrapper[13205]: I0319 09:23:49.006072 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 09:23:49.007248 master-0 kubenswrapper[13205]: I0319 09:23:49.007187 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-ca\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:49.008726 master-0 kubenswrapper[13205]: I0319 09:23:49.007623 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b333a1e-2a7f-423a-8b40-99f30c89f740-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:23:49.008726 master-0 kubenswrapper[13205]: I0319 09:23:49.008072 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 
19 09:23:49.008726 master-0 kubenswrapper[13205]: I0319 09:23:49.008550 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:23:49.008896 master-0 kubenswrapper[13205]: I0319 09:23:49.008790 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-serving-cert\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:49.010007 master-0 kubenswrapper[13205]: I0319 09:23:49.009044 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9076d131-644a-4332-8a70-34f6b0f71575-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" Mar 19 09:23:49.010341 master-0 kubenswrapper[13205]: I0319 09:23:49.010056 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-key\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:23:49.010487 master-0 kubenswrapper[13205]: I0319 09:23:49.010396 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8527f5cd-2992-44be-90b8-e9086cedf46e-serving-cert\") pod 
\"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:23:49.011442 master-0 kubenswrapper[13205]: I0319 09:23:49.010911 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:23:49.011442 master-0 kubenswrapper[13205]: I0319 09:23:49.011104 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-srv-cert\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692" Mar 19 09:23:49.011442 master-0 kubenswrapper[13205]: I0319 09:23:49.011265 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.011825 master-0 kubenswrapper[13205]: I0319 09:23:49.011472 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7fae040-28fa-4d97-8482-fd0dd12cc921-serving-cert\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:49.011825 master-0 kubenswrapper[13205]: I0319 
09:23:49.011754 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.012384 master-0 kubenswrapper[13205]: I0319 09:23:49.012286 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:49.012884 master-0 kubenswrapper[13205]: I0319 09:23:49.012788 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-script-lib\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.013369 master-0 kubenswrapper[13205]: I0319 09:23:49.013327 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3dddb56-d180-4b8a-85bd-77c3888d8f71-signing-cabundle\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv" Mar 19 09:23:49.013441 master-0 kubenswrapper[13205]: I0319 09:23:49.013414 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-image-import-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " 
pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.013514 master-0 kubenswrapper[13205]: I0319 09:23:49.013489 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:23:49.013671 master-0 kubenswrapper[13205]: I0319 09:23:49.013645 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.013709 master-0 kubenswrapper[13205]: I0319 09:23:49.013684 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-catalog-content\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:49.013754 master-0 kubenswrapper[13205]: I0319 09:23:49.013736 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:23:49.013854 master-0 kubenswrapper[13205]: I0319 09:23:49.013771 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.013923 master-0 kubenswrapper[13205]: I0319 09:23:49.013906 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.013999 master-0 kubenswrapper[13205]: I0319 09:23:49.013964 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-etcd-client\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk" Mar 19 09:23:49.014178 master-0 kubenswrapper[13205]: I0319 09:23:49.014147 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:23:49.014241 master-0 kubenswrapper[13205]: I0319 09:23:49.014208 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.014285 master-0 kubenswrapper[13205]: I0319 
09:23:49.014240 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f585ebb1-6210-463b-af85-fb29e1e7dfa5-cache\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.014337 master-0 kubenswrapper[13205]: I0319 09:23:49.014290 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.014337 master-0 kubenswrapper[13205]: I0319 09:23:49.014317 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:49.014410 master-0 kubenswrapper[13205]: I0319 09:23:49.014367 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:23:49.014410 master-0 kubenswrapper[13205]: I0319 09:23:49.014396 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config\") pod 
\"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:23:49.014490 master-0 kubenswrapper[13205]: I0319 09:23:49.014468 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:23:49.014608 master-0 kubenswrapper[13205]: I0319 09:23:49.014569 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovn-node-metrics-cert\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.014829 master-0 kubenswrapper[13205]: I0319 09:23:49.014796 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:49.015247 master-0 kubenswrapper[13205]: I0319 09:23:49.015217 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-serving-cert\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.015326 master-0 kubenswrapper[13205]: I0319 09:23:49.015309 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.015443 master-0 kubenswrapper[13205]: I0319 09:23:49.015346 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtw68\" (UniqueName: \"kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:23:49.015443 master-0 kubenswrapper[13205]: I0319 09:23:49.015391 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-trusted-ca-bundle\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.015443 master-0 kubenswrapper[13205]: I0319 09:23:49.015420 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6tp5\" (UniqueName: \"kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:23:49.015614 master-0 kubenswrapper[13205]: I0319 09:23:49.015448 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " 
pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.015614 master-0 kubenswrapper[13205]: I0319 09:23:49.015475 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:23:49.015614 master-0 kubenswrapper[13205]: I0319 09:23:49.015503 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7jx\" (UniqueName: \"kubernetes.io/projected/741c9d25-7634-41c0-bfe4-b7a15de4b341-kube-api-access-4w7jx\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:49.015614 master-0 kubenswrapper[13205]: I0319 09:23:49.015547 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6dq\" (UniqueName: \"kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:23:49.015614 master-0 kubenswrapper[13205]: I0319 09:23:49.015574 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.015614 master-0 kubenswrapper[13205]: I0319 09:23:49.015602 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-etcd-client\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.015850 master-0 kubenswrapper[13205]: I0319 09:23:49.015628 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.015850 master-0 kubenswrapper[13205]: I0319 09:23:49.015657 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:23:49.015850 master-0 kubenswrapper[13205]: I0319 09:23:49.015681 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7ppn\" (UniqueName: \"kubernetes.io/projected/5a51c701-7f2a-4332-a301-746e8a0eb475-kube-api-access-g7ppn\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.015850 master-0 kubenswrapper[13205]: I0319 09:23:49.015707 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg4cn\" (UniqueName: \"kubernetes.io/projected/56e11aac-d199-404a-a0e2-82c28926746d-kube-api-access-pg4cn\") pod \"migrator-8487694857-g9497\" (UID: \"56e11aac-d199-404a-a0e2-82c28926746d\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497" Mar 19 09:23:49.015850 master-0 
kubenswrapper[13205]: I0319 09:23:49.015744 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxq7\" (UniqueName: \"kubernetes.io/projected/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-kube-api-access-6hxq7\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:49.015850 master-0 kubenswrapper[13205]: I0319 09:23:49.015770 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.016090 master-0 kubenswrapper[13205]: I0319 09:23:49.015990 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rqsq\" (UniqueName: \"kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:23:49.016090 master-0 kubenswrapper[13205]: I0319 09:23:49.016022 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:23:49.016090 master-0 kubenswrapper[13205]: I0319 09:23:49.016040 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod 
\"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:23:49.016090 master-0 kubenswrapper[13205]: I0319 09:23:49.016059 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhzsr\" (UniqueName: \"kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.016090 master-0 kubenswrapper[13205]: I0319 09:23:49.016076 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:23:49.016090 master-0 kubenswrapper[13205]: I0319 09:23:49.016094 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-catalog-content\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016115 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016132 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016150 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016151 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-catalog-content\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016170 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6j2m\" (UniqueName: \"kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016199 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016216 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-dir\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016233 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016250 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016269 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016285 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016303 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kr8w\" (UniqueName: \"kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:49.016314 master-0 kubenswrapper[13205]: I0319 09:23:49.016322 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hqj\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016340 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016357 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: 
I0319 09:23:49.016375 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-encryption-config\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016417 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016437 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016453 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016470 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: 
\"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016486 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016506 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016568 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:23:49.016887 master-0 kubenswrapper[13205]: I0319 09:23:49.016587 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfq74\" (UniqueName: \"kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:23:49.017475 master-0 kubenswrapper[13205]: I0319 09:23:49.017435 13205 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:23:49.017615 master-0 kubenswrapper[13205]: I0319 09:23:49.017589 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:49.017674 master-0 kubenswrapper[13205]: I0319 09:23:49.017637 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49fpz\" (UniqueName: \"kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:49.017674 master-0 kubenswrapper[13205]: I0319 09:23:49.017668 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.017762 master-0 kubenswrapper[13205]: I0319 09:23:49.017686 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: 
\"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.017762 master-0 kubenswrapper[13205]: I0319 09:23:49.017718 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-utilities\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:49.017846 master-0 kubenswrapper[13205]: I0319 09:23:49.017818 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-policies\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.017895 master-0 kubenswrapper[13205]: I0319 09:23:49.017854 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hnvh\" (UniqueName: \"kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:23:49.017938 master-0 kubenswrapper[13205]: I0319 09:23:49.017918 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:23:49.017995 master-0 kubenswrapper[13205]: I0319 09:23:49.017961 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/979d4d12-a560-4309-a1d3-cbebe853e8ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.017995 master-0 kubenswrapper[13205]: I0319 09:23:49.017981 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxjqg\" (UniqueName: \"kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.018080 master-0 kubenswrapper[13205]: I0319 09:23:49.018025 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:23:49.018126 master-0 kubenswrapper[13205]: I0319 09:23:49.018075 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.018126 master-0 kubenswrapper[13205]: I0319 09:23:49.018107 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sclqq\" (UniqueName: \"kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " 
pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:49.018208 master-0 kubenswrapper[13205]: I0319 09:23:49.018161 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:23:49.018208 master-0 kubenswrapper[13205]: I0319 09:23:49.018180 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:23:49.018208 master-0 kubenswrapper[13205]: I0319 09:23:49.018198 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-audit\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.018208 master-0 kubenswrapper[13205]: I0319 09:23:49.018201 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-webhook-cert\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:23:49.018370 master-0 kubenswrapper[13205]: I0319 09:23:49.018238 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:49.018370 master-0 kubenswrapper[13205]: I0319 09:23:49.018263 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:49.018370 master-0 kubenswrapper[13205]: I0319 09:23:49.018282 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.018564 master-0 kubenswrapper[13205]: I0319 09:23:49.018495 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-ovnkube-config\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.018828 master-0 kubenswrapper[13205]: I0319 09:23:49.018800 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-env-overrides\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx" Mar 19 09:23:49.018883 master-0 kubenswrapper[13205]: I0319 09:23:49.018855 13205 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f585ebb1-6210-463b-af85-fb29e1e7dfa5-cache\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.019067 master-0 kubenswrapper[13205]: I0319 09:23:49.019036 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41659a48-5eea-41cd-8b2a-b683dc15cc11-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" Mar 19 09:23:49.019067 master-0 kubenswrapper[13205]: I0319 09:23:49.019052 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fae040-28fa-4d97-8482-fd0dd12cc921-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:49.019747 master-0 kubenswrapper[13205]: I0319 09:23:49.019594 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13072c08-c77c-4170-9ebe-98d63968747b-metrics-certs\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:23:49.019747 master-0 kubenswrapper[13205]: I0319 09:23:49.019666 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: 
\"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:23:49.019902 master-0 kubenswrapper[13205]: I0319 09:23:49.019876 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj" Mar 19 09:23:49.020113 master-0 kubenswrapper[13205]: I0319 09:23:49.020077 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03d12dab-1215-4c1f-a9f5-27ea7174d308-metrics-tls\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:23:49.020698 master-0 kubenswrapper[13205]: I0319 09:23:49.020673 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d12dab-1215-4c1f-a9f5-27ea7174d308-trusted-ca\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh" Mar 19 09:23:49.021147 master-0 kubenswrapper[13205]: I0319 09:23:49.020854 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-utilities\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:49.021147 master-0 kubenswrapper[13205]: I0319 09:23:49.020964 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3b333a1e-2a7f-423a-8b40-99f30c89f740-config\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59" Mar 19 09:23:49.021147 master-0 kubenswrapper[13205]: I0319 09:23:49.020992 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-policies\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.021147 master-0 kubenswrapper[13205]: I0319 09:23:49.021041 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d664acc4-ec4f-4078-ae93-404a14ea18fc-config\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9" Mar 19 09:23:49.021277 master-0 kubenswrapper[13205]: I0319 09:23:49.021152 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43cb2a3b-40e2-45ee-894a-6c833ee17efd-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7" Mar 19 09:23:49.021470 master-0 kubenswrapper[13205]: I0319 09:23:49.021416 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k" Mar 19 
09:23:49.021470 master-0 kubenswrapper[13205]: I0319 09:23:49.021436 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b50118d-f7c2-4bff-aca0-5c6623819baf-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl" Mar 19 09:23:49.021637 master-0 kubenswrapper[13205]: I0319 09:23:49.021480 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741c9d25-7634-41c0-bfe4-b7a15de4b341-catalog-content\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z" Mar 19 09:23:49.021702 master-0 kubenswrapper[13205]: I0319 09:23:49.021664 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:23:49.021758 master-0 kubenswrapper[13205]: I0319 09:23:49.021739 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8527f5cd-2992-44be-90b8-e9086cedf46e-config\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898" Mar 19 09:23:49.021956 master-0 kubenswrapper[13205]: I0319 09:23:49.021811 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0c75102-6790-4ed3-84da-61c3611186f8-config\") pod 
\"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6" Mar 19 09:23:49.021956 master-0 kubenswrapper[13205]: I0319 09:23:49.021814 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c247d991-809e-46b6-9617-9b05007b7560-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6" Mar 19 09:23:49.022178 master-0 kubenswrapper[13205]: I0319 09:23:49.022137 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9" Mar 19 09:23:49.022931 master-0 kubenswrapper[13205]: I0319 09:23:49.022796 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abcf2ea-50f5-4d62-8a23-583438e5b451-metrics-tls\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:23:49.023272 master-0 kubenswrapper[13205]: I0319 09:23:49.023236 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 09:23:49.052574 master-0 kubenswrapper[13205]: I0319 09:23:49.038854 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 09:23:49.052574 master-0 kubenswrapper[13205]: I0319 09:23:49.039239 13205 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-encryption-config\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.069672 master-0 kubenswrapper[13205]: I0319 09:23:49.069186 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 09:23:49.069855 master-0 kubenswrapper[13205]: I0319 09:23:49.069841 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-serving-ca\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.102922 master-0 kubenswrapper[13205]: I0319 09:23:49.102861 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 09:23:49.103200 master-0 kubenswrapper[13205]: I0319 09:23:49.103169 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:23:49.104987 master-0 kubenswrapper[13205]: I0319 09:23:49.104951 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a51c701-7f2a-4332-a301-746e8a0eb475-trusted-ca-bundle\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129103 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-docker\") pod 
\"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129183 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129212 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129281 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-kubelet\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129312 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-system-cni-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129413 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: 
\"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129443 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129475 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-netns\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129577 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-audit-dir\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129656 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-audit-dir\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129712 13205 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129781 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129805 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129873 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129902 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-node-pullsecrets\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129957 13205 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.129984 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130054 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130081 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130124 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130161 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130185 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130204 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130226 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130249 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " 
pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130317 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130356 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130423 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130446 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130449 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-cnibin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 
19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130482 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-dir\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130501 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130558 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130609 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130643 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-os-release\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: 
\"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130661 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130688 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-systemd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130703 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130715 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130741 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: 
\"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130768 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130785 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-multus\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130794 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130819 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130839 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-docker\") pod 
\"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130846 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130884 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-etc-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130919 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-kubelet\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130945 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-cnibin\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130964 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-socket-dir-parent\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.131042 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/4abcf2ea-50f5-4d62-8a23-583438e5b451-host-etc-kube\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28" Mar 19 09:23:49.132772 master-0 kubenswrapper[13205]: I0319 09:23:49.130739 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-k8s-cni-cncf-io\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133354 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/979d4d12-a560-4309-a1d3-cbebe853e8ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133431 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a51c701-7f2a-4332-a301-746e8a0eb475-audit-dir\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133472 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" 
(UniqueName: \"kubernetes.io/host-path/f585ebb1-6210-463b-af85-fb29e1e7dfa5-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133516 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-systemd-units\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133581 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-netd\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133617 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3374940a-612d-4335-8236-3ffe8d6e73a5-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133655 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-system-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133674 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133713 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133764 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133801 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133861 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133883 13205 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133921 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133951 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133975 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.133996 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134022 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134463 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-netns\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134483 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-hostroot\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134495 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-run-multus-certs\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134619 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-etc-kubernetes\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134649 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-ovn\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134676 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-host-var-lib-cni-bin\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134664 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134715 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-slash\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134749 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134756 13205 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-os-release\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134786 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-node-log\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134789 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f65184f-8fc2-4656-8776-a3b962aa1f5d-host-slash\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134818 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-var-lib-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134825 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-run-ovn-kubernetes\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134845 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-host-cni-bin\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134848 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-conf-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134910 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134961 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64f60856-22dd-4560-acff-c620e17844a1-node-pullsecrets\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.134990 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-run-openvswitch\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.135090 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.135170 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-log-socket\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.135519 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:49.135926 master-0 kubenswrapper[13205]: I0319 09:23:49.135557 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 09:23:49.142023 master-0 kubenswrapper[13205]: I0319 09:23:49.140213 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-serving-cert\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.142023 master-0 kubenswrapper[13205]: I0319 09:23:49.140257 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 09:23:49.142023 master-0 kubenswrapper[13205]: I0319 09:23:49.140868 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:49.142786 master-0 kubenswrapper[13205]: I0319 09:23:49.142050 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2ct9k" Mar 19 09:23:49.142786 master-0 kubenswrapper[13205]: I0319 09:23:49.142511 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/157e3524-eb27-41ca-b49d-2697ee1245ca-multus-cni-dir\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd" Mar 19 09:23:49.143216 master-0 kubenswrapper[13205]: I0319 09:23:49.143177 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3374940a-612d-4335-8236-3ffe8d6e73a5-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.146258 master-0 kubenswrapper[13205]: I0319 09:23:49.146225 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7dmw4" Mar 19 09:23:49.158592 master-0 kubenswrapper[13205]: I0319 09:23:49.158480 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:23:49.159834 master-0 kubenswrapper[13205]: I0319 09:23:49.159809 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-etcd-client\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.178708 master-0 kubenswrapper[13205]: I0319 09:23:49.178642 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:23:49.181835 master-0 kubenswrapper[13205]: I0319 09:23:49.181799 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-etcd-serving-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.199078 master-0 kubenswrapper[13205]: I0319 09:23:49.199036 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 09:23:49.202472 master-0 kubenswrapper[13205]: I0319 09:23:49.202438 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-audit\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.219892 master-0 kubenswrapper[13205]: I0319 09:23:49.219833 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 
09:23:49.222403 master-0 kubenswrapper[13205]: I0319 09:23:49.222360 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a51c701-7f2a-4332-a301-746e8a0eb475-etcd-client\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47" Mar 19 09:23:49.234924 master-0 kubenswrapper[13205]: I0319 09:23:49.234296 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content\") pod \"4d2c5580-36f6-4107-af53-cfbd15080b30\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " Mar 19 09:23:49.235230 master-0 kubenswrapper[13205]: I0319 09:23:49.235165 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content\") pod \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " Mar 19 09:23:49.235338 master-0 kubenswrapper[13205]: I0319 09:23:49.235306 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities\") pod \"4d2c5580-36f6-4107-af53-cfbd15080b30\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " Mar 19 09:23:49.235381 master-0 kubenswrapper[13205]: I0319 09:23:49.235359 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities\") pod \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " Mar 19 09:23:49.236185 master-0 kubenswrapper[13205]: I0319 09:23:49.236131 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities" (OuterVolumeSpecName: "utilities") pod "4d2c5580-36f6-4107-af53-cfbd15080b30" (UID: "4d2c5580-36f6-4107-af53-cfbd15080b30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:23:49.237126 master-0 kubenswrapper[13205]: I0319 09:23:49.237077 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities" (OuterVolumeSpecName: "utilities") pod "0bce9154-cd31-4c4a-9d86-2903d5b1adad" (UID: "0bce9154-cd31-4c4a-9d86-2903d5b1adad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:23:49.239250 master-0 kubenswrapper[13205]: I0319 09:23:49.239207 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:23:49.241260 master-0 kubenswrapper[13205]: I0319 09:23:49.241212 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-image-import-ca\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.259689 master-0 kubenswrapper[13205]: I0319 09:23:49.259652 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 09:23:49.263449 master-0 kubenswrapper[13205]: I0319 09:23:49.263403 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.278729 master-0 kubenswrapper[13205]: I0319 09:23:49.278681 13205 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 09:23:49.282834 master-0 kubenswrapper[13205]: I0319 09:23:49.282780 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64f60856-22dd-4560-acff-c620e17844a1-encryption-config\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.296894 master-0 kubenswrapper[13205]: I0319 09:23:49.296813 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bce9154-cd31-4c4a-9d86-2903d5b1adad" (UID: "0bce9154-cd31-4c4a-9d86-2903d5b1adad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:23:49.299826 master-0 kubenswrapper[13205]: I0319 09:23:49.299737 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d2c5580-36f6-4107-af53-cfbd15080b30" (UID: "4d2c5580-36f6-4107-af53-cfbd15080b30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:23:49.300869 master-0 kubenswrapper[13205]: I0319 09:23:49.300818 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 09:23:49.327807 master-0 kubenswrapper[13205]: I0319 09:23:49.327747 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:23:49.329320 master-0 kubenswrapper[13205]: I0319 09:23:49.329296 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64f60856-22dd-4560-acff-c620e17844a1-trusted-ca-bundle\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:49.338310 master-0 kubenswrapper[13205]: I0319 09:23:49.338267 13205 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:49.338310 master-0 kubenswrapper[13205]: I0319 09:23:49.338303 13205 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-utilities\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:49.338563 master-0 kubenswrapper[13205]: I0319 09:23:49.338317 13205 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce9154-cd31-4c4a-9d86-2903d5b1adad-utilities\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:49.338563 master-0 kubenswrapper[13205]: I0319 09:23:49.338330 13205 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2c5580-36f6-4107-af53-cfbd15080b30-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:49.339242 
master-0 kubenswrapper[13205]: I0319 09:23:49.339226 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:23:49.359795 master-0 kubenswrapper[13205]: I0319 09:23:49.359754 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:23:49.368416 master-0 kubenswrapper[13205]: I0319 09:23:49.368379 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 09:23:49.379468 master-0 kubenswrapper[13205]: I0319 09:23:49.379431 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 09:23:49.404798 master-0 kubenswrapper[13205]: I0319 09:23:49.404750 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 09:23:49.412286 master-0 kubenswrapper[13205]: I0319 09:23:49.412157 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:23:49.419456 master-0 kubenswrapper[13205]: I0319 09:23:49.419399 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 19 09:23:49.439308 master-0 kubenswrapper[13205]: I0319 09:23:49.439258 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") pod \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " Mar 19 09:23:49.439308 master-0 
kubenswrapper[13205]: I0319 09:23:49.439303 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") pod \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") " Mar 19 09:23:49.439592 master-0 kubenswrapper[13205]: I0319 09:23:49.439372 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ff98fb1e-7a1f-4657-b085-743d6f2d28e2" (UID: "ff98fb1e-7a1f-4657-b085-743d6f2d28e2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:49.439592 master-0 kubenswrapper[13205]: I0319 09:23:49.439434 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock" (OuterVolumeSpecName: "var-lock") pod "ff98fb1e-7a1f-4657-b085-743d6f2d28e2" (UID: "ff98fb1e-7a1f-4657-b085-743d6f2d28e2"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:23:49.439897 master-0 kubenswrapper[13205]: I0319 09:23:49.439868 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:49.439897 master-0 kubenswrapper[13205]: I0319 09:23:49.439892 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:23:49.445186 master-0 kubenswrapper[13205]: I0319 09:23:49.445149 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 19 09:23:49.452892 master-0 kubenswrapper[13205]: I0319 09:23:49.452848 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:23:49.459366 master-0 kubenswrapper[13205]: I0319 09:23:49.459313 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:23:49.478929 master-0 kubenswrapper[13205]: I0319 09:23:49.478891 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 09:23:49.498992 master-0 kubenswrapper[13205]: I0319 09:23:49.498936 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:23:49.503049 master-0 kubenswrapper[13205]: I0319 09:23:49.503015 13205 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:49.519283 master-0 kubenswrapper[13205]: I0319 09:23:49.519228 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-wfcn9" Mar 19 09:23:49.539259 master-0 kubenswrapper[13205]: I0319 09:23:49.539198 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:23:49.542119 master-0 kubenswrapper[13205]: I0319 09:23:49.542069 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:49.558937 master-0 kubenswrapper[13205]: I0319 09:23:49.558866 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:23:49.578810 master-0 kubenswrapper[13205]: I0319 09:23:49.578747 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:23:49.587876 master-0 kubenswrapper[13205]: I0319 09:23:49.587820 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " 
pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" Mar 19 09:23:49.600081 master-0 kubenswrapper[13205]: I0319 09:23:49.599363 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:23:49.621787 master-0 kubenswrapper[13205]: I0319 09:23:49.621725 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-224sj" Mar 19 09:23:49.641428 master-0 kubenswrapper[13205]: I0319 09:23:49.641367 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:23:49.643332 master-0 kubenswrapper[13205]: I0319 09:23:49.643293 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:49.659011 master-0 kubenswrapper[13205]: I0319 09:23:49.658971 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:23:49.678860 master-0 kubenswrapper[13205]: I0319 09:23:49.678757 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 09:23:49.681380 master-0 kubenswrapper[13205]: I0319 09:23:49.681322 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:49.699632 master-0 kubenswrapper[13205]: I0319 
09:23:49.699581 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:23:49.700828 master-0 kubenswrapper[13205]: I0319 09:23:49.700782 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:49.726212 master-0 kubenswrapper[13205]: I0319 09:23:49.726153 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:23:49.740265 master-0 kubenswrapper[13205]: I0319 09:23:49.740107 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" Mar 19 09:23:49.741521 master-0 kubenswrapper[13205]: I0319 09:23:49.741461 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:23:49.759282 master-0 kubenswrapper[13205]: I0319 09:23:49.759221 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-s9ktx" Mar 19 09:23:49.780044 master-0 kubenswrapper[13205]: I0319 09:23:49.779987 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2svn2" Mar 19 09:23:49.799234 master-0 kubenswrapper[13205]: I0319 09:23:49.799183 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-9tw96" Mar 19 
09:23:49.819027 master-0 kubenswrapper[13205]: I0319 09:23:49.818967 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-p8jxl" Mar 19 09:23:49.860460 master-0 kubenswrapper[13205]: I0319 09:23:49.860416 13205 trace.go:236] Trace[543693980]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:23:38.869) (total time: 10991ms): Mar 19 09:23:49.860460 master-0 kubenswrapper[13205]: Trace[543693980]: ---"Objects listed" error: 10991ms (09:23:49.860) Mar 19 09:23:49.860460 master-0 kubenswrapper[13205]: Trace[543693980]: [10.991235185s] [10.991235185s] END Mar 19 09:23:49.860460 master-0 kubenswrapper[13205]: I0319 09:23:49.860443 13205 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 09:23:49.863850 master-0 kubenswrapper[13205]: E0319 09:23:49.863805 13205 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.016s" Mar 19 09:23:49.863850 master-0 kubenswrapper[13205]: I0319 09:23:49.863854 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:23:49.863955 master-0 kubenswrapper[13205]: I0319 09:23:49.863882 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 09:23:49.863955 master-0 kubenswrapper[13205]: I0319 09:23:49.863895 13205 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="dbae0633-6a68-4488-aa81-b650c4c8d698" Mar 19 09:23:49.863955 master-0 kubenswrapper[13205]: I0319 09:23:49.863931 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2ct9k" 
event={"ID":"4d2c5580-36f6-4107-af53-cfbd15080b30","Type":"ContainerDied","Data":"813a77628cdb690ef9ed760c21cb05d1f17fab6329f59eb55493fe5e4d55f0d3"} Mar 19 09:23:49.863955 master-0 kubenswrapper[13205]: I0319 09:23:49.863952 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 09:23:49.864059 master-0 kubenswrapper[13205]: I0319 09:23:49.863962 13205 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="dbae0633-6a68-4488-aa81-b650c4c8d698" Mar 19 09:23:49.864059 master-0 kubenswrapper[13205]: I0319 09:23:49.863972 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dmw4" event={"ID":"0bce9154-cd31-4c4a-9d86-2903d5b1adad","Type":"ContainerDied","Data":"b34e9a2b33556321559c3a7fd34bd69e2a162921e3a485dc8edf1c710c34dfa7"} Mar 19 09:23:49.864059 master-0 kubenswrapper[13205]: I0319 09:23:49.864007 13205 scope.go:117] "RemoveContainer" containerID="e8716e6b475b5ecb7bf3ee3310e613b5df09c280bbb15165ff8d6e13b5af9e6b" Mar 19 09:23:49.879740 master-0 kubenswrapper[13205]: I0319 09:23:49.879699 13205 scope.go:117] "RemoveContainer" containerID="82319940bf8e72e7e1c996daea2af1c07c45f38503055b429ac09e5abb8f28d6" Mar 19 09:23:49.891381 master-0 kubenswrapper[13205]: I0319 09:23:49.891338 13205 scope.go:117] "RemoveContainer" containerID="20ad0dc2c8fe0c77234c92295139868f3667eee62c0b6d6d6951ddd42c52079f" Mar 19 09:23:49.898308 master-0 kubenswrapper[13205]: I0319 09:23:49.898151 13205 request.go:700] Waited for 1.009744412s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0 Mar 19 09:23:49.901081 master-0 kubenswrapper[13205]: I0319 09:23:49.901048 13205 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 19 09:23:49.901302 master-0 kubenswrapper[13205]: 
I0319 09:23:49.901288 13205 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 19 09:23:49.902898 master-0 kubenswrapper[13205]: I0319 09:23:49.902816 13205 scope.go:117] "RemoveContainer" containerID="bf149ff2c777ec19da6a404f555dbbecaec9d99f5badeb4692ea25e2aab65ea8" Mar 19 09:23:49.933484 master-0 kubenswrapper[13205]: I0319 09:23:49.933379 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvnb9\" (UniqueName: \"kubernetes.io/projected/d9eb3750-cb7b-4d3c-88bc-d1b68a370872-kube-api-access-lvnb9\") pod \"ovnkube-node-vcxjs\" (UID: \"d9eb3750-cb7b-4d3c-88bc-d1b68a370872\") " pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:49.951371 master-0 kubenswrapper[13205]: I0319 09:23:49.951336 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/16d2930b-486b-492d-983e-c6702d8f53a7-kube-api-access-h5hk6\") pod \"dns-operator-9c5679d8f-cbw4r\" (UID: \"16d2930b-486b-492d-983e-c6702d8f53a7\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-cbw4r" Mar 19 09:23:49.972506 master-0 kubenswrapper[13205]: I0319 09:23:49.972468 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"cluster-version-operator-56d8475767-prd2q\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:23:49.994352 master-0 kubenswrapper[13205]: I0319 09:23:49.994294 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5jl\" (UniqueName: \"kubernetes.io/projected/64f60856-22dd-4560-acff-c620e17844a1-kube-api-access-cf5jl\") pod \"apiserver-7dcf67dd86-6hgld\" (UID: \"64f60856-22dd-4560-acff-c620e17844a1\") " pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:50.010731 
master-0 kubenswrapper[13205]: I0319 09:23:50.010673 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwbw\" (UniqueName: \"kubernetes.io/projected/e7fae040-28fa-4d97-8482-fd0dd12cc921-kube-api-access-jqwbw\") pod \"authentication-operator-5885bfd7f4-k4dfd\" (UID: \"e7fae040-28fa-4d97-8482-fd0dd12cc921\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-k4dfd" Mar 19 09:23:50.031245 master-0 kubenswrapper[13205]: I0319 09:23:50.031192 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clpb5\" (UniqueName: \"kubernetes.io/projected/13072c08-c77c-4170-9ebe-98d63968747b-kube-api-access-clpb5\") pod \"network-metrics-daemon-nq9vs\" (UID: \"13072c08-c77c-4170-9ebe-98d63968747b\") " pod="openshift-multus/network-metrics-daemon-nq9vs" Mar 19 09:23:50.034199 master-0 kubenswrapper[13205]: I0319 09:23:50.034148 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:50.034267 master-0 kubenswrapper[13205]: I0319 09:23:50.034217 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:23:50.036743 master-0 kubenswrapper[13205]: I0319 09:23:50.036691 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:50.036827 master-0 kubenswrapper[13205]: I0319 09:23:50.036801 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:50.042867 master-0 kubenswrapper[13205]: I0319 09:23:50.042821 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld" Mar 19 09:23:50.054401 master-0 kubenswrapper[13205]: I0319 09:23:50.054347 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-9plst\" (UniqueName: \"kubernetes.io/projected/8e073eb4-67f2-4de7-8848-50da73079dbc-kube-api-access-9plst\") pod \"csi-snapshot-controller-operator-5f5d689c6b-jv8lm\" (UID: \"8e073eb4-67f2-4de7-8848-50da73079dbc\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-jv8lm"
Mar 19 09:23:50.058719 master-0 kubenswrapper[13205]: I0319 09:23:50.058691 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:23:50.059078 master-0 kubenswrapper[13205]: I0319 09:23:50.059056 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:23:50.071977 master-0 kubenswrapper[13205]: I0319 09:23:50.071929 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v88k\" (UniqueName: \"kubernetes.io/projected/259794ab-d027-497a-b08e-5a6d79057668-kube-api-access-6v88k\") pod \"catalog-operator-68f85b4d6c-jg9m5\" (UID: \"259794ab-d027-497a-b08e-5a6d79057668\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:23:50.092869 master-0 kubenswrapper[13205]: E0319 09:23:50.092834 13205 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 09:23:50.093157 master-0 kubenswrapper[13205]: E0319 09:23:50.093136 13205 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 09:23:50.093377 master-0 kubenswrapper[13205]: E0319 09:23:50.093365 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access podName:ff98fb1e-7a1f-4657-b085-743d6f2d28e2 nodeName:}" failed. No retries permitted until 2026-03-19 09:23:50.593316354 +0000 UTC m=+15.925623242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "ff98fb1e-7a1f-4657-b085-743d6f2d28e2") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 09:23:50.111837 master-0 kubenswrapper[13205]: I0319 09:23:50.111778 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod \"multus-admission-controller-5dbbb8b86f-fvh8d\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"
Mar 19 09:23:50.132461 master-0 kubenswrapper[13205]: I0319 09:23:50.132420 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgbq\" (UniqueName: \"kubernetes.io/projected/10c609bb-136a-4ce2-b9e2-0a03e1a37a62-kube-api-access-tpgbq\") pod \"network-check-target-4s5vc\" (UID: \"10c609bb-136a-4ce2-b9e2-0a03e1a37a62\") " pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:23:50.149500 master-0 kubenswrapper[13205]: I0319 09:23:50.149411 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access\") pod \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\" (UID: \"ff98fb1e-7a1f-4657-b085-743d6f2d28e2\") "
Mar 19 09:23:50.152843 master-0 kubenswrapper[13205]: I0319 09:23:50.152816 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxfs\" (UniqueName: \"kubernetes.io/projected/8c8ee765-76b8-4cde-8acb-6e5edd1b8149-kube-api-access-djxfs\") pod \"cluster-monitoring-operator-58845fbb57-rtzvj\" (UID: \"8c8ee765-76b8-4cde-8acb-6e5edd1b8149\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-rtzvj"
Mar 19 09:23:50.153054 master-0 kubenswrapper[13205]: I0319 09:23:50.152961 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"ff98fb1e-7a1f-4657-b085-743d6f2d28e2","Type":"ContainerDied","Data":"f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e"}
Mar 19 09:23:50.153054 master-0 kubenswrapper[13205]: I0319 09:23:50.152993 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2b06f4e36c66727358cf033020ec300296b581135453e3576489d12e345e41e"
Mar 19 09:23:50.153054 master-0 kubenswrapper[13205]: I0319 09:23:50.153003 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 09:23:50.154774 master-0 kubenswrapper[13205]: I0319 09:23:50.154732 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ff98fb1e-7a1f-4657-b085-743d6f2d28e2" (UID: "ff98fb1e-7a1f-4657-b085-743d6f2d28e2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:23:50.171572 master-0 kubenswrapper[13205]: I0319 09:23:50.171520 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbdq\" (UniqueName: \"kubernetes.io/projected/a3dddb56-d180-4b8a-85bd-77c3888d8f71-kube-api-access-nxbdq\") pod \"service-ca-79bc6b8d76-l54xv\" (UID: \"a3dddb56-d180-4b8a-85bd-77c3888d8f71\") " pod="openshift-service-ca/service-ca-79bc6b8d76-l54xv"
Mar 19 09:23:50.191748 master-0 kubenswrapper[13205]: I0319 09:23:50.191621 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6sr8\" (UniqueName: \"kubernetes.io/projected/c5966fa8-b9f0-42ee-a75b-20014782366d-kube-api-access-v6sr8\") pod \"redhat-marketplace-995hm\" (UID: \"c5966fa8-b9f0-42ee-a75b-20014782366d\") " pod="openshift-marketplace/redhat-marketplace-995hm"
Mar 19 09:23:50.210001 master-0 kubenswrapper[13205]: I0319 09:23:50.209943 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww85l\" (UniqueName: \"kubernetes.io/projected/dc65ec1f-b8fb-40d6-ac39-46b255a33221-kube-api-access-ww85l\") pod \"csi-snapshot-controller-64854d9cff-v9s9c\" (UID: \"dc65ec1f-b8fb-40d6-ac39-46b255a33221\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c"
Mar 19 09:23:50.235611 master-0 kubenswrapper[13205]: I0319 09:23:50.235445 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bdnt\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-kube-api-access-6bdnt\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:23:50.251736 master-0 kubenswrapper[13205]: I0319 09:23:50.251675 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7t5\" (UniqueName: \"kubernetes.io/projected/a1098584-43b9-4f2c-83d2-22d95fb7b0c3-kube-api-access-vl7t5\") pod \"etcd-operator-8544cbcf9c-5bddk\" (UID: \"a1098584-43b9-4f2c-83d2-22d95fb7b0c3\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-5bddk"
Mar 19 09:23:50.252400 master-0 kubenswrapper[13205]: I0319 09:23:50.252360 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff98fb1e-7a1f-4657-b085-743d6f2d28e2-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:23:50.273028 master-0 kubenswrapper[13205]: I0319 09:23:50.272973 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d664acc4-ec4f-4078-ae93-404a14ea18fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-zddz9\" (UID: \"d664acc4-ec4f-4078-ae93-404a14ea18fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-zddz9"
Mar 19 09:23:50.292300 master-0 kubenswrapper[13205]: I0319 09:23:50.292231 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcf6\" (UniqueName: \"kubernetes.io/projected/9076d131-644a-4332-8a70-34f6b0f71575-kube-api-access-2vcf6\") pod \"cluster-node-tuning-operator-598fbc5f8f-smksb\" (UID: \"9076d131-644a-4332-8a70-34f6b0f71575\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb"
Mar 19 09:23:50.333492 master-0 kubenswrapper[13205]: I0319 09:23:50.333409 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:23:50.333709 master-0 kubenswrapper[13205]: I0319 09:23:50.333501 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:23:50.344557 master-0 kubenswrapper[13205]: I0319 09:23:50.344469 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-jg9m5"
Mar 19 09:23:50.347960 master-0 kubenswrapper[13205]: I0319 09:23:50.347915 13205 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:23:50.348490 master-0 kubenswrapper[13205]: I0319 09:23:50.348452 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4rw\" (UniqueName: \"kubernetes.io/projected/f585ebb1-6210-463b-af85-fb29e1e7dfa5-kube-api-access-5g4rw\") pod \"operator-controller-controller-manager-57777556ff-ft7tt\" (UID: \"f585ebb1-6210-463b-af85-fb29e1e7dfa5\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:23:50.348584 master-0 kubenswrapper[13205]: I0319 09:23:50.348487 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f5s\" (UniqueName: \"kubernetes.io/projected/dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e-kube-api-access-w5f5s\") pod \"marketplace-operator-89ccd998f-gxznr\" (UID: \"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:23:50.355240 master-0 kubenswrapper[13205]: I0319 09:23:50.355187 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blbc\" (UniqueName: \"kubernetes.io/projected/1694c93a-9acb-4bec-bfd6-3ec370e7a0b4-kube-api-access-9blbc\") pod \"service-ca-operator-b865698dc-f6kkd\" (UID: \"1694c93a-9acb-4bec-bfd6-3ec370e7a0b4\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-f6kkd"
Mar 19 09:23:50.371081 master-0 kubenswrapper[13205]: I0319 09:23:50.370741 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvd6f\" (UniqueName: \"kubernetes.io/projected/3b333a1e-2a7f-423a-8b40-99f30c89f740-kube-api-access-xvd6f\") pod \"openshift-apiserver-operator-d65958b8-55s59\" (UID: \"3b333a1e-2a7f-423a-8b40-99f30c89f740\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-55s59"
Mar 19 09:23:50.397678 master-0 kubenswrapper[13205]: I0319 09:23:50.397586 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hw6b\" (UniqueName: \"kubernetes.io/projected/7b29cb7b-26d2-4fab-9e03-2d7fdf937592-kube-api-access-8hw6b\") pod \"olm-operator-5c9796789-rh692\" (UID: \"7b29cb7b-26d2-4fab-9e03-2d7fdf937592\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:23:50.430319 master-0 kubenswrapper[13205]: I0319 09:23:50.430251 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j65pb\" (UniqueName: \"kubernetes.io/projected/4f65184f-8fc2-4656-8776-a3b962aa1f5d-kube-api-access-j65pb\") pod \"iptables-alerter-qfc76\" (UID: \"4f65184f-8fc2-4656-8776-a3b962aa1f5d\") " pod="openshift-network-operator/iptables-alerter-qfc76"
Mar 19 09:23:50.436687 master-0 kubenswrapper[13205]: I0319 09:23:50.436625 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0c75102-6790-4ed3-84da-61c3611186f8-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-pvlq6\" (UID: \"f0c75102-6790-4ed3-84da-61c3611186f8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-pvlq6"
Mar 19 09:23:50.453817 master-0 kubenswrapper[13205]: I0319 09:23:50.453722 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03d12dab-1215-4c1f-a9f5-27ea7174d308-bound-sa-token\") pod \"ingress-operator-66b84d69b-rvwfh\" (UID: \"03d12dab-1215-4c1f-a9f5-27ea7174d308\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-rvwfh"
Mar 19 09:23:50.473031 master-0 kubenswrapper[13205]: I0319 09:23:50.472970 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp9jf\" (UniqueName: \"kubernetes.io/projected/8527f5cd-2992-44be-90b8-e9086cedf46e-kube-api-access-qp9jf\") pod \"openshift-controller-manager-operator-8c94f4649-v9898\" (UID: \"8527f5cd-2992-44be-90b8-e9086cedf46e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-v9898"
Mar 19 09:23:50.494011 master-0 kubenswrapper[13205]: I0319 09:23:50.493971 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2x6\" (UniqueName: \"kubernetes.io/projected/ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2-kube-api-access-jw2x6\") pod \"community-operators-wfkb9\" (UID: \"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2\") " pod="openshift-marketplace/community-operators-wfkb9"
Mar 19 09:23:50.509847 master-0 kubenswrapper[13205]: I0319 09:23:50.509738 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpcn\" (UniqueName: \"kubernetes.io/projected/3374940a-612d-4335-8236-3ffe8d6e73a5-kube-api-access-kmpcn\") pod \"catalogd-controller-manager-6864dc98f7-r28hm\" (UID: \"3374940a-612d-4335-8236-3ffe8d6e73a5\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:23:50.534897 master-0 kubenswrapper[13205]: I0319 09:23:50.534859 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfq74\" (UniqueName: \"kubernetes.io/projected/58ea8fcc-29b2-48ef-8629-2ba217c9d70c-kube-api-access-sfq74\") pod \"network-node-identity-slmgx\" (UID: \"58ea8fcc-29b2-48ef-8629-2ba217c9d70c\") " pod="openshift-network-node-identity/network-node-identity-slmgx"
Mar 19 09:23:50.553193 master-0 kubenswrapper[13205]: I0319 09:23:50.553136 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kr8w\" (UniqueName: \"kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w\") pod \"certified-operators-7dmw4\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") " pod="openshift-marketplace/certified-operators-7dmw4"
Mar 19 09:23:50.556198 master-0 kubenswrapper[13205]: I0319 09:23:50.556136 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kr8w\" (UniqueName: \"kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w\") pod \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\" (UID: \"0bce9154-cd31-4c4a-9d86-2903d5b1adad\") "
Mar 19 09:23:50.560016 master-0 kubenswrapper[13205]: I0319 09:23:50.559831 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w" (OuterVolumeSpecName: "kube-api-access-4kr8w") pod "0bce9154-cd31-4c4a-9d86-2903d5b1adad" (UID: "0bce9154-cd31-4c4a-9d86-2903d5b1adad"). InnerVolumeSpecName "kube-api-access-4kr8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:23:50.579308 master-0 kubenswrapper[13205]: I0319 09:23:50.578915 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hqj\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-kube-api-access-v4hqj\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:23:50.595173 master-0 kubenswrapper[13205]: I0319 09:23:50.595130 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtw68\" (UniqueName: \"kubernetes.io/projected/41659a48-5eea-41cd-8b2a-b683dc15cc11-kube-api-access-jtw68\") pod \"ovnkube-control-plane-57f769d897-hcnr7\" (UID: \"41659a48-5eea-41cd-8b2a-b683dc15cc11\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7"
Mar 19 09:23:50.613845 master-0 kubenswrapper[13205]: I0319 09:23:50.613805 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6tp5\" (UniqueName: \"kubernetes.io/projected/9a6c1523-e77c-4aac-814c-05d41215c42f-kube-api-access-m6tp5\") pod \"package-server-manager-7b95f86987-5jsnd\" (UID: \"9a6c1523-e77c-4aac-814c-05d41215c42f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:23:50.624022 master-0 kubenswrapper[13205]: I0319 09:23:50.623469 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:23:50.626559 master-0 kubenswrapper[13205]: I0319 09:23:50.625626 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd"
Mar 19 09:23:50.633319 master-0 kubenswrapper[13205]: I0319 09:23:50.633241 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-7qnf9\" (UID: \"e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-7qnf9"
Mar 19 09:23:50.634668 master-0 kubenswrapper[13205]: I0319 09:23:50.633358 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:23:50.634668 master-0 kubenswrapper[13205]: I0319 09:23:50.633636 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:23:50.637084 master-0 kubenswrapper[13205]: I0319 09:23:50.636982 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:23:50.637084 master-0 kubenswrapper[13205]: I0319 09:23:50.637025 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr"
Mar 19 09:23:50.637794 master-0 kubenswrapper[13205]: I0319 09:23:50.637382 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:23:50.638465 master-0 kubenswrapper[13205]: I0319 09:23:50.638447 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm"
Mar 19 09:23:50.639259 master-0 kubenswrapper[13205]: I0319 09:23:50.639217 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt"
Mar 19 09:23:50.639994 master-0 kubenswrapper[13205]: I0319 09:23:50.639961 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-rh692"
Mar 19 09:23:50.653234 master-0 kubenswrapper[13205]: I0319 09:23:50.653175 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7jx\" (UniqueName: \"kubernetes.io/projected/741c9d25-7634-41c0-bfe4-b7a15de4b341-kube-api-access-4w7jx\") pod \"certified-operators-xr42z\" (UID: \"741c9d25-7634-41c0-bfe4-b7a15de4b341\") " pod="openshift-marketplace/certified-operators-xr42z"
Mar 19 09:23:50.658839 master-0 kubenswrapper[13205]: I0319 09:23:50.658764 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kr8w\" (UniqueName: \"kubernetes.io/projected/0bce9154-cd31-4c4a-9d86-2903d5b1adad-kube-api-access-4kr8w\") on node \"master-0\" DevicePath \"\""
Mar 19 09:23:50.672823 master-0 kubenswrapper[13205]: I0319 09:23:50.672731 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6dq\" (UniqueName: \"kubernetes.io/projected/43cb2a3b-40e2-45ee-894a-6c833ee17efd-kube-api-access-vf6dq\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7\" (UID: \"43cb2a3b-40e2-45ee-894a-6c833ee17efd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-np5d7"
Mar 19 09:23:50.694074 master-0 kubenswrapper[13205]: I0319 09:23:50.694031 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7ppn\" (UniqueName: \"kubernetes.io/projected/5a51c701-7f2a-4332-a301-746e8a0eb475-kube-api-access-g7ppn\") pod \"apiserver-57c47bdf6-d9h47\" (UID: \"5a51c701-7f2a-4332-a301-746e8a0eb475\") " pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:23:50.712710 master-0 kubenswrapper[13205]: I0319 09:23:50.712475 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg4cn\" (UniqueName: \"kubernetes.io/projected/56e11aac-d199-404a-a0e2-82c28926746d-kube-api-access-pg4cn\") pod \"migrator-8487694857-g9497\" (UID: \"56e11aac-d199-404a-a0e2-82c28926746d\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-g9497"
Mar 19 09:23:50.733378 master-0 kubenswrapper[13205]: I0319 09:23:50.733333 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxq7\" (UniqueName: \"kubernetes.io/projected/bf5dde46-8a95-46a6-bee5-20d3a58f33ee-kube-api-access-6hxq7\") pod \"redhat-operators-4gs4g\" (UID: \"bf5dde46-8a95-46a6-bee5-20d3a58f33ee\") " pod="openshift-marketplace/redhat-operators-4gs4g"
Mar 19 09:23:50.783339 master-0 kubenswrapper[13205]: I0319 09:23:50.783291 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rqsq\" (UniqueName: \"kubernetes.io/projected/3b50118d-f7c2-4bff-aca0-5c6623819baf-kube-api-access-6rqsq\") pod \"cluster-olm-operator-67dcd4998-p9czl\" (UID: \"3b50118d-f7c2-4bff-aca0-5c6623819baf\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-p9czl"
Mar 19 09:23:50.784065 master-0 kubenswrapper[13205]: I0319 09:23:50.783952 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhzsr\" (UniqueName: \"kubernetes.io/projected/157e3524-eb27-41ca-b49d-2697ee1245ca-kube-api-access-qhzsr\") pod \"multus-bzdzd\" (UID: \"157e3524-eb27-41ca-b49d-2697ee1245ca\") " pod="openshift-multus/multus-bzdzd"
Mar 19 09:23:50.791974 master-0 kubenswrapper[13205]: I0319 09:23:50.791918 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fpz\" (UniqueName: \"kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz\") pod \"controller-manager-6c8fd866bf-g46sj\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") " pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:23:50.815335 master-0 kubenswrapper[13205]: I0319 09:23:50.815238 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hnvh\" (UniqueName: \"kubernetes.io/projected/4abcf2ea-50f5-4d62-8a23-583438e5b451-kube-api-access-2hnvh\") pod \"network-operator-7bd846bfc4-b4d28\" (UID: \"4abcf2ea-50f5-4d62-8a23-583438e5b451\") " pod="openshift-network-operator/network-operator-7bd846bfc4-b4d28"
Mar 19 09:23:50.847380 master-0 kubenswrapper[13205]: I0319 09:23:50.847323 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c247d991-809e-46b6-9617-9b05007b7560-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5m8t6\" (UID: \"c247d991-809e-46b6-9617-9b05007b7560\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5m8t6"
Mar 19 09:23:50.853240 master-0 kubenswrapper[13205]: I0319 09:23:50.853118 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxjqg\" (UniqueName: \"kubernetes.io/projected/979d4d12-a560-4309-a1d3-cbebe853e8ea-kube-api-access-rxjqg\") pod \"multus-additional-cni-plugins-8kv6s\" (UID: \"979d4d12-a560-4309-a1d3-cbebe853e8ea\") " pod="openshift-multus/multus-additional-cni-plugins-8kv6s"
Mar 19 09:23:50.876786 master-0 kubenswrapper[13205]: I0319 09:23:50.876753 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sclqq\" (UniqueName: \"kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq\") pod \"route-controller-manager-8555fbf585-9ggfr\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") " pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:23:50.888036 master-0 kubenswrapper[13205]: I0319 09:23:50.887976 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:50.896188 master-0 kubenswrapper[13205]: I0319 09:23:50.896151 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6j2m\" (UniqueName: \"kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m\") pod \"community-operators-2ct9k\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") " pod="openshift-marketplace/community-operators-2ct9k"
Mar 19 09:23:50.935920 master-0 kubenswrapper[13205]: I0319 09:23:50.935790 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:23:50.935920 master-0 kubenswrapper[13205]: I0319 09:23:50.935908 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:23:50.938248 master-0 kubenswrapper[13205]: I0319 09:23:50.938201 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:23:50.942658 master-0 kubenswrapper[13205]: I0319 09:23:50.941483 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:23:50.943297 master-0 kubenswrapper[13205]: I0319 09:23:50.943277 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:23:50.944832 master-0 kubenswrapper[13205]: I0319 09:23:50.944764 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:23:50.945777 master-0 kubenswrapper[13205]: I0319 09:23:50.945727 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:23:50.961710 master-0 kubenswrapper[13205]: I0319 09:23:50.961656 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6j2m\" (UniqueName: \"kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m\") pod \"4d2c5580-36f6-4107-af53-cfbd15080b30\" (UID: \"4d2c5580-36f6-4107-af53-cfbd15080b30\") "
Mar 19 09:23:50.967120 master-0 kubenswrapper[13205]: I0319 09:23:50.966951 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m" (OuterVolumeSpecName: "kube-api-access-x6j2m") pod "4d2c5580-36f6-4107-af53-cfbd15080b30" (UID: "4d2c5580-36f6-4107-af53-cfbd15080b30"). InnerVolumeSpecName "kube-api-access-x6j2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:23:51.060245 master-0 kubenswrapper[13205]: I0319 09:23:51.060135 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=21.060111476 podStartE2EDuration="21.060111476s" podCreationTimestamp="2026-03-19 09:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:51.059618404 +0000 UTC m=+16.391925302" watchObservedRunningTime="2026-03-19 09:23:51.060111476 +0000 UTC m=+16.392418364"
Mar 19 09:23:51.062860 master-0 kubenswrapper[13205]: I0319 09:23:51.062744 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6j2m\" (UniqueName: \"kubernetes.io/projected/4d2c5580-36f6-4107-af53-cfbd15080b30-kube-api-access-x6j2m\") on node \"master-0\" DevicePath \"\""
Mar 19 09:23:51.164670 master-0 kubenswrapper[13205]: I0319 09:23:51.164592 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4s5vc" event={"ID":"10c609bb-136a-4ce2-b9e2-0a03e1a37a62","Type":"ContainerStarted","Data":"cedf00264b64f8165806588371fffc6854d86c7cd95031f4df546b377605897a"}
Mar 19 09:23:51.164670 master-0 kubenswrapper[13205]: I0319 09:23:51.164668 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4s5vc" event={"ID":"10c609bb-136a-4ce2-b9e2-0a03e1a37a62","Type":"ContainerStarted","Data":"2a28c272218ceebe3247b72677c55db596dadef4d7baae0def309252cd35d5e6"}
Mar 19 09:23:51.165573 master-0 kubenswrapper[13205]: I0319 09:23:51.164922 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-4s5vc"
Mar 19 09:23:51.168887 master-0 kubenswrapper[13205]: I0319 09:23:51.168838 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-995hm" event={"ID":"c5966fa8-b9f0-42ee-a75b-20014782366d","Type":"ContainerStarted","Data":"76d77deffe2a557a141b77b7bce38f16259abe3c8e83a830f0bc306843c81879"}
Mar 19 09:23:51.171481 master-0 kubenswrapper[13205]: I0319 09:23:51.171448 13205 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:23:51.500903 master-0 kubenswrapper[13205]: I0319 09:23:51.500834 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=21.500819923999998 podStartE2EDuration="21.500819924s" podCreationTimestamp="2026-03-19 09:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:51.500041746 +0000 UTC m=+16.832348634" watchObservedRunningTime="2026-03-19 09:23:51.500819924 +0000 UTC m=+16.833126812"
Mar 19 09:23:52.175926 master-0 kubenswrapper[13205]: I0319 09:23:52.175595 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wfkb9" event={"ID":"ba059ee7-0b57-442b-b6c4-b1e6cb3fd0f2","Type":"ContainerStarted","Data":"2e4bcb1fd4341843a18ec2e2e2968354b37d840eddfaae7afdcf5162f81cd02b"}
Mar 19 09:23:52.177987 master-0 kubenswrapper[13205]: I0319 09:23:52.177943 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gs4g" event={"ID":"bf5dde46-8a95-46a6-bee5-20d3a58f33ee","Type":"ContainerStarted","Data":"67bef670f2fc3b00c9bbcef0f4ee56356b2838e91235d4b6097a31ee1def05f9"}
Mar 19 09:23:52.180249 master-0 kubenswrapper[13205]: I0319 09:23:52.180184 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr42z" event={"ID":"741c9d25-7634-41c0-bfe4-b7a15de4b341","Type":"ContainerStarted","Data":"3b7e1024d36f3606d4f4badcb873e6c968e805824182a86614d6d124c09d2758"}
Mar 19 09:23:52.203102 master-0 kubenswrapper[13205]: I0319 09:23:52.203050 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:52.208303 master-0 kubenswrapper[13205]: I0319 09:23:52.208259 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:52.281800 master-0 kubenswrapper[13205]: I0319 09:23:52.280294 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:23:53.189604 master-0 kubenswrapper[13205]: I0319 09:23:53.189518 13205 generic.go:334] "Generic (PLEG): container finished" podID="741c9d25-7634-41c0-bfe4-b7a15de4b341" containerID="3b7e1024d36f3606d4f4badcb873e6c968e805824182a86614d6d124c09d2758" exitCode=0
Mar 19 09:23:53.192967 master-0 kubenswrapper[13205]: I0319 09:23:53.189600 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr42z" event={"ID":"741c9d25-7634-41c0-bfe4-b7a15de4b341","Type":"ContainerDied","Data":"3b7e1024d36f3606d4f4badcb873e6c968e805824182a86614d6d124c09d2758"}
Mar 19 09:23:53.194428 master-0 kubenswrapper[13205]: I0319 09:23:53.194357 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:23:53.631785 master-0 kubenswrapper[13205]: I0319 09:23:53.631375 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:53.652084 master-0 kubenswrapper[13205]: I0319 09:23:53.652025 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:55.945226 master-0 kubenswrapper[13205]: I0319 09:23:55.944942 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-7dcf67dd86-6hgld"
Mar 19 09:23:55.947573 master-0 kubenswrapper[13205]: I0319 09:23:55.947519 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-57c47bdf6-d9h47"
Mar 19 09:24:00.341283 master-0 kubenswrapper[13205]: I0319 09:24:00.341229 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-995hm"
Mar 19 09:24:00.342007 master-0 kubenswrapper[13205]: I0319 09:24:00.341399 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-995hm"
Mar 19 09:24:00.393189 master-0 kubenswrapper[13205]: I0319 09:24:00.393142 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-995hm"
Mar 19 09:24:00.641016 master-0 kubenswrapper[13205]: I0319 09:24:00.640507 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wfkb9"
Mar 19 09:24:00.641016 master-0 kubenswrapper[13205]: I0319 09:24:00.640948 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wfkb9"
Mar 19 09:24:00.697243 master-0 kubenswrapper[13205]: I0319 09:24:00.697179 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wfkb9"
Mar 19 09:24:00.939686 master-0 kubenswrapper[13205]: I0319 09:24:00.939473 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4gs4g"
Mar 19 09:24:00.939686 master-0 kubenswrapper[13205]: I0319 09:24:00.939537 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4gs4g"
Mar 19 09:24:00.974650 master-0 kubenswrapper[13205]: I0319 09:24:00.974554 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4gs4g"
Mar 19 09:24:01.264633 master-0 kubenswrapper[13205]: I0319 09:24:01.264456 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-995hm"
Mar 19 09:24:01.265069 master-0 kubenswrapper[13205]: I0319 09:24:01.265034 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wfkb9"
Mar 19 09:24:01.291308 master-0 kubenswrapper[13205]: I0319 09:24:01.291234 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4gs4g"
Mar 19 09:24:02.241223 master-0 kubenswrapper[13205]: I0319 09:24:02.241103 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr42z" event={"ID":"741c9d25-7634-41c0-bfe4-b7a15de4b341","Type":"ContainerStarted","Data":"43fc5b5a23c862d8d00a07486dcd00c1031ce25a6e6b63e869e3538646945cdf"}
Mar 19 09:24:02.514782 master-0 kubenswrapper[13205]: I0319 09:24:02.514707 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:24:02.982833 master-0 kubenswrapper[13205]: I0319 09:24:02.982537 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2ct9k"]
Mar 19 09:24:06.600818 master-0 kubenswrapper[13205]: I0319 09:24:06.600473 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2ct9k"]
Mar 19 09:24:06.858860 master-0 kubenswrapper[13205]: I0319 09:24:06.858678 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2c5580-36f6-4107-af53-cfbd15080b30" path="/var/lib/kubelet/pods/4d2c5580-36f6-4107-af53-cfbd15080b30/volumes"
Mar 19 09:24:10.940824 master-0 kubenswrapper[13205]: I0319 09:24:10.940356 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xr42z"
Mar 19 09:24:10.942557 master-0 kubenswrapper[13205]: I0319 09:24:10.940892 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xr42z"
Mar 19 09:24:11.004337 master-0 kubenswrapper[13205]: I0319 09:24:11.004278 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xr42z"
Mar 19 09:24:11.324212 master-0 kubenswrapper[13205]: I0319 09:24:11.324169 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xr42z"
Mar 19 09:24:17.139343 master-0 kubenswrapper[13205]: I0319 09:24:17.139063 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dmw4"]
Mar 19 09:24:17.416250 master-0 kubenswrapper[13205]: I0319 09:24:17.416143 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7dmw4"]
Mar 19 09:24:18.395273 master-0 kubenswrapper[13205]: I0319 09:24:18.395220 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs"
Mar 19 09:24:18.396046 master-0 kubenswrapper[13205]: I0319 09:24:18.395357 13205 prober_manager.go:312]
"Failed to trigger a manual run" probe="Readiness" Mar 19 09:24:18.425690 master-0 kubenswrapper[13205]: I0319 09:24:18.425619 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vcxjs" Mar 19 09:24:18.862569 master-0 kubenswrapper[13205]: I0319 09:24:18.862442 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" path="/var/lib/kubelet/pods/0bce9154-cd31-4c4a-9d86-2903d5b1adad/volumes" Mar 19 09:24:20.339888 master-0 kubenswrapper[13205]: I0319 09:24:20.339556 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4s5vc" Mar 19 09:24:21.013450 master-0 kubenswrapper[13205]: I0319 09:24:21.013385 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-79jrh"] Mar 19 09:24:21.013655 master-0 kubenswrapper[13205]: E0319 09:24:21.013605 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerName="extract-utilities" Mar 19 09:24:21.013655 master-0 kubenswrapper[13205]: I0319 09:24:21.013619 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerName="extract-utilities" Mar 19 09:24:21.013655 master-0 kubenswrapper[13205]: E0319 09:24:21.013637 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:24:21.013655 master-0 kubenswrapper[13205]: I0319 09:24:21.013646 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: E0319 09:24:21.013658 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerName="installer" Mar 19 
09:24:21.013773 master-0 kubenswrapper[13205]: I0319 09:24:21.013667 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerName="installer" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: E0319 09:24:21.013677 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerName="extract-content" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: I0319 09:24:21.013683 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerName="extract-content" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: E0319 09:24:21.013694 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerName="installer" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: I0319 09:24:21.013700 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerName="installer" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: E0319 09:24:21.013713 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: I0319 09:24:21.013721 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: E0319 09:24:21.013741 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerName="extract-content" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: I0319 09:24:21.013748 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerName="extract-content" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: E0319 09:24:21.013762 13205 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerName="extract-utilities" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: I0319 09:24:21.013770 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerName="extract-utilities" Mar 19 09:24:21.013773 master-0 kubenswrapper[13205]: E0319 09:24:21.013780 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.013786 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: E0319 09:24:21.013798 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.013807 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: E0319 09:24:21.013827 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434aabfa-50db-407e-92d3-a034696613e3" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.013835 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="434aabfa-50db-407e-92d3-a034696613e3" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: E0319 09:24:21.013845 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.013853 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:24:21.014280 master-0 
kubenswrapper[13205]: I0319 09:24:21.013969 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="434aabfa-50db-407e-92d3-a034696613e3" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.013983 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerName="extract-content" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.013992 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014003 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2c5580-36f6-4107-af53-cfbd15080b30" containerName="extract-utilities" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014019 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014030 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="014ef8bd-b940-41e2-9239-c238afe6ebae" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014048 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff98fb1e-7a1f-4657-b085-743d6f2d28e2" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014062 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerName="extract-utilities" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014077 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebf851a-172c-4f6d-9b72-9ae8afa5e950" containerName="prober" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014090 13205 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9" containerName="assisted-installer-controller" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014100 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" containerName="installer" Mar 19 09:24:21.014280 master-0 kubenswrapper[13205]: I0319 09:24:21.014109 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce9154-cd31-4c4a-9d86-2903d5b1adad" containerName="extract-content" Mar 19 09:24:21.014964 master-0 kubenswrapper[13205]: I0319 09:24:21.014678 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.016026 master-0 kubenswrapper[13205]: I0319 09:24:21.015992 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-rb955"] Mar 19 09:24:21.016784 master-0 kubenswrapper[13205]: I0319 09:24:21.016759 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.019708 master-0 kubenswrapper[13205]: I0319 09:24:21.019685 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-czcgc" Mar 19 09:24:21.019820 master-0 kubenswrapper[13205]: I0319 09:24:21.019781 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 09:24:21.019967 master-0 kubenswrapper[13205]: I0319 09:24:21.019919 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 09:24:21.020047 master-0 kubenswrapper[13205]: I0319 09:24:21.020016 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 09:24:21.036051 master-0 kubenswrapper[13205]: I0319 09:24:21.034067 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 09:24:21.213650 master-0 kubenswrapper[13205]: I0319 09:24:21.213596 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-run\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213650 master-0 kubenswrapper[13205]: I0319 09:24:21.213647 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-867pw\" (UniqueName: \"kubernetes.io/projected/fce9ea11-1498-4ef6-ba71-d125c193159c-kube-api-access-867pw\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213678 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-host\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213707 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-lib-modules\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213726 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213754 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-sys\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213771 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-var-lib-kubelet\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213791 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-tuned\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213809 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fce9ea11-1498-4ef6-ba71-d125c193159c-tmp\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213834 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-kubernetes\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213897 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/745093e5-ffe1-4443-b317-448948f3b311-config-volume\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213924 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-modprobe-d\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 
kubenswrapper[13205]: I0319 09:24:21.213944 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysctl-d\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213970 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvtc\" (UniqueName: \"kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.213989 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysctl-conf\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.213996 master-0 kubenswrapper[13205]: I0319 09:24:21.214013 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysconfig\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.214960 master-0 kubenswrapper[13205]: I0319 09:24:21.214031 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-systemd\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.314974 master-0 kubenswrapper[13205]: I0319 09:24:21.314849 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysconfig\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.314974 master-0 kubenswrapper[13205]: I0319 09:24:21.314953 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-systemd\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315421 master-0 kubenswrapper[13205]: I0319 09:24:21.315005 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-run\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315421 master-0 kubenswrapper[13205]: I0319 09:24:21.315009 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysconfig\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315421 master-0 kubenswrapper[13205]: I0319 09:24:21.315042 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-867pw\" (UniqueName: \"kubernetes.io/projected/fce9ea11-1498-4ef6-ba71-d125c193159c-kube-api-access-867pw\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315421 master-0 kubenswrapper[13205]: I0319 09:24:21.315232 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-systemd\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315923 master-0 kubenswrapper[13205]: I0319 09:24:21.315488 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-run\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315923 master-0 kubenswrapper[13205]: I0319 09:24:21.315501 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-host\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315923 master-0 kubenswrapper[13205]: I0319 09:24:21.315601 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-host\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315923 master-0 kubenswrapper[13205]: I0319 09:24:21.315659 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-lib-modules\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315923 master-0 
kubenswrapper[13205]: I0319 09:24:21.315721 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.315923 master-0 kubenswrapper[13205]: I0319 09:24:21.315784 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-sys\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315923 master-0 kubenswrapper[13205]: I0319 09:24:21.315910 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-sys\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.315923 master-0 kubenswrapper[13205]: I0319 09:24:21.315930 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-var-lib-kubelet\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316007 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-var-lib-kubelet\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: E0319 09:24:21.316059 13205 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316136 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-tuned\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: E0319 09:24:21.316163 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:21.816135908 +0000 UTC m=+47.148442826 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : secret "dns-default-metrics-tls" not found Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316170 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-lib-modules\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316191 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fce9ea11-1498-4ef6-ba71-d125c193159c-tmp\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316250 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-kubernetes\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316356 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/745093e5-ffe1-4443-b317-448948f3b311-config-volume\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316392 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-modprobe-d\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316459 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-kubernetes\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316497 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysctl-d\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316594 13205 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5cvtc\" (UniqueName: \"kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316605 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-modprobe-d\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316627 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysctl-conf\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.316798 master-0 kubenswrapper[13205]: I0319 09:24:21.316711 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysctl-d\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.318151 master-0 kubenswrapper[13205]: I0319 09:24:21.316952 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-sysctl-conf\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.318151 master-0 kubenswrapper[13205]: I0319 09:24:21.317688 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/745093e5-ffe1-4443-b317-448948f3b311-config-volume\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.318151 master-0 kubenswrapper[13205]: I0319 09:24:21.317715 13205 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 09:24:21.325502 master-0 kubenswrapper[13205]: I0319 09:24:21.325439 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fce9ea11-1498-4ef6-ba71-d125c193159c-tmp\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.326977 master-0 kubenswrapper[13205]: I0319 09:24:21.326942 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fce9ea11-1498-4ef6-ba71-d125c193159c-etc-tuned\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.422438 master-0 kubenswrapper[13205]: I0319 09:24:21.422348 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-79jrh"] Mar 19 09:24:21.689773 master-0 kubenswrapper[13205]: I0319 09:24:21.689598 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x"] Mar 19 09:24:21.690845 master-0 kubenswrapper[13205]: I0319 09:24:21.690798 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:21.700401 master-0 kubenswrapper[13205]: W0319 09:24:21.700334 13205 reflector.go:561] object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert": failed to list *v1.Secret: secrets "cloud-credential-operator-serving-cert" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-cloud-credential-operator": no relationship found between node 'master-0' and this object Mar 19 09:24:21.700727 master-0 kubenswrapper[13205]: E0319 09:24:21.700407 13205 reflector.go:158] "Unhandled Error" err="object-\"openshift-cloud-credential-operator\"/\"cloud-credential-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cloud-credential-operator-serving-cert\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cloud-credential-operator\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 19 09:24:21.700727 master-0 kubenswrapper[13205]: W0319 09:24:21.700496 13205 reflector.go:561] object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xh8t6": failed to list *v1.Secret: secrets "cloud-credential-operator-dockercfg-xh8t6" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-cloud-credential-operator": no relationship found between node 'master-0' and this object Mar 19 09:24:21.700727 master-0 kubenswrapper[13205]: E0319 09:24:21.700519 13205 reflector.go:158] "Unhandled Error" err="object-\"openshift-cloud-credential-operator\"/\"cloud-credential-operator-dockercfg-xh8t6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cloud-credential-operator-dockercfg-xh8t6\" is forbidden: User \"system:node:master-0\" cannot list 
resource \"secrets\" in API group \"\" in the namespace \"openshift-cloud-credential-operator\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 19 09:24:21.700727 master-0 kubenswrapper[13205]: W0319 09:24:21.700620 13205 reflector.go:561] object-"openshift-cloud-credential-operator"/"cco-trusted-ca": failed to list *v1.ConfigMap: configmaps "cco-trusted-ca" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cloud-credential-operator": no relationship found between node 'master-0' and this object Mar 19 09:24:21.700727 master-0 kubenswrapper[13205]: E0319 09:24:21.700644 13205 reflector.go:158] "Unhandled Error" err="object-\"openshift-cloud-credential-operator\"/\"cco-trusted-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cco-trusted-ca\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cloud-credential-operator\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 19 09:24:21.701599 master-0 kubenswrapper[13205]: W0319 09:24:21.701510 13205 reflector.go:561] object-"openshift-cloud-credential-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cloud-credential-operator": no relationship found between node 'master-0' and this object Mar 19 09:24:21.701748 master-0 kubenswrapper[13205]: E0319 09:24:21.701602 13205 reflector.go:158] "Unhandled Error" err="object-\"openshift-cloud-credential-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-cloud-credential-operator\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 19 09:24:21.701748 master-0 kubenswrapper[13205]: W0319 09:24:21.701644 13205 reflector.go:561] object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cloud-credential-operator": no relationship found between node 'master-0' and this object Mar 19 09:24:21.701748 master-0 kubenswrapper[13205]: E0319 09:24:21.701667 13205 reflector.go:158] "Unhandled Error" err="object-\"openshift-cloud-credential-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cloud-credential-operator\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 19 09:24:21.711079 master-0 kubenswrapper[13205]: E0319 09:24:21.710763 13205 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 19 09:24:21.711079 master-0 kubenswrapper[13205]: E0319 09:24:21.710815 13205 projected.go:194] Error preparing data for projected volume kube-api-access-5cvtc for pod openshift-dns/dns-default-79jrh: configmap "kube-root-ca.crt" not found Mar 19 09:24:21.711079 master-0 kubenswrapper[13205]: E0319 09:24:21.710892 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:22.210863622 +0000 UTC m=+47.543170550 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5cvtc" (UniqueName: "kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : configmap "kube-root-ca.crt" not found Mar 19 09:24:21.726483 master-0 kubenswrapper[13205]: I0319 09:24:21.726433 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-867pw\" (UniqueName: \"kubernetes.io/projected/fce9ea11-1498-4ef6-ba71-d125c193159c-kube-api-access-867pw\") pod \"tuned-rb955\" (UID: \"fce9ea11-1498-4ef6-ba71-d125c193159c\") " pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.734606 master-0 kubenswrapper[13205]: I0319 09:24:21.734509 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:21.734823 master-0 kubenswrapper[13205]: I0319 09:24:21.734628 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:21.734823 master-0 kubenswrapper[13205]: I0319 09:24:21.734681 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lqwx\" (UniqueName: \"kubernetes.io/projected/6775d7ec-8114-4fc3-a23d-d5ac910f3285-kube-api-access-7lqwx\") pod 
\"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:21.835947 master-0 kubenswrapper[13205]: I0319 09:24:21.835723 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:21.835947 master-0 kubenswrapper[13205]: I0319 09:24:21.835814 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:21.835947 master-0 kubenswrapper[13205]: I0319 09:24:21.835855 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lqwx\" (UniqueName: \"kubernetes.io/projected/6775d7ec-8114-4fc3-a23d-d5ac910f3285-kube-api-access-7lqwx\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:21.835947 master-0 kubenswrapper[13205]: I0319 09:24:21.836075 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 
09:24:21.836581 master-0 kubenswrapper[13205]: E0319 09:24:21.836146 13205 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 19 09:24:21.836581 master-0 kubenswrapper[13205]: E0319 09:24:21.836214 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:22.836194846 +0000 UTC m=+48.168501744 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : secret "dns-default-metrics-tls" not found Mar 19 09:24:21.918665 master-0 kubenswrapper[13205]: I0319 09:24:21.918585 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x"] Mar 19 09:24:21.936671 master-0 kubenswrapper[13205]: I0319 09:24:21.935811 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j"] Mar 19 09:24:21.936955 master-0 kubenswrapper[13205]: I0319 09:24:21.936888 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7"] Mar 19 09:24:21.938352 master-0 kubenswrapper[13205]: I0319 09:24:21.938312 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:21.938931 master-0 kubenswrapper[13205]: I0319 09:24:21.938888 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:21.939959 master-0 kubenswrapper[13205]: I0319 09:24:21.938923 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbsl4\" (UniqueName: \"kubernetes.io/projected/9e10cb6e-5703-4e4d-a82b-f6de34888b65-kube-api-access-bbsl4\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:21.940266 master-0 kubenswrapper[13205]: I0319 09:24:21.940223 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e10cb6e-5703-4e4d-a82b-f6de34888b65-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:21.940659 master-0 kubenswrapper[13205]: I0319 09:24:21.940613 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpqn7\" (UniqueName: \"kubernetes.io/projected/0e517f77-ec32-4376-b1ff-88ec24a22e3e-kube-api-access-rpqn7\") pod \"cluster-samples-operator-85f7577d78-tkj2j\" (UID: \"0e517f77-ec32-4376-b1ff-88ec24a22e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:21.940923 master-0 kubenswrapper[13205]: I0319 09:24:21.940885 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e10cb6e-5703-4e4d-a82b-f6de34888b65-images\") pod 
\"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:21.941160 master-0 kubenswrapper[13205]: I0319 09:24:21.941125 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/9e10cb6e-5703-4e4d-a82b-f6de34888b65-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:21.942551 master-0 kubenswrapper[13205]: I0319 09:24:21.942479 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e517f77-ec32-4376-b1ff-88ec24a22e3e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-tkj2j\" (UID: \"0e517f77-ec32-4376-b1ff-88ec24a22e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:21.942846 master-0 kubenswrapper[13205]: I0319 09:24:21.942257 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89"] Mar 19 09:24:21.943332 master-0 kubenswrapper[13205]: I0319 09:24:21.942936 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.943787 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.942905 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9e10cb6e-5703-4e4d-a82b-f6de34888b65-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.943028 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-s8fs4" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.943111 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.943146 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.943231 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.945745 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.945913 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 09:24:21.949699 master-0 kubenswrapper[13205]: I0319 09:24:21.946135 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-b9dtc" Mar 19 09:24:21.949699 master-0 
kubenswrapper[13205]: I0319 09:24:21.949441 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:24:21.955624 master-0 kubenswrapper[13205]: I0319 09:24:21.955580 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq"] Mar 19 09:24:21.955913 master-0 kubenswrapper[13205]: I0319 09:24:21.955701 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:21.956014 master-0 kubenswrapper[13205]: I0319 09:24:21.955959 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rb955" Mar 19 09:24:21.959299 master-0 kubenswrapper[13205]: I0319 09:24:21.957668 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 09:24:21.959299 master-0 kubenswrapper[13205]: I0319 09:24:21.959044 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 09:24:21.959299 master-0 kubenswrapper[13205]: I0319 09:24:21.959070 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 09:24:21.959299 master-0 kubenswrapper[13205]: I0319 09:24:21.959259 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-xnqzt" Mar 19 09:24:21.975725 master-0 kubenswrapper[13205]: I0319 09:24:21.959401 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 09:24:21.975725 master-0 kubenswrapper[13205]: I0319 09:24:21.959516 13205 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 09:24:21.982905 master-0 kubenswrapper[13205]: I0319 09:24:21.977045 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:21.982905 master-0 kubenswrapper[13205]: I0319 09:24:21.978960 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-gpc6r" Mar 19 09:24:21.982905 master-0 kubenswrapper[13205]: I0319 09:24:21.979365 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 19 09:24:22.002472 master-0 kubenswrapper[13205]: I0319 09:24:22.002426 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"] Mar 19 09:24:22.002706 master-0 kubenswrapper[13205]: I0319 09:24:22.002486 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j"] Mar 19 09:24:22.002758 master-0 kubenswrapper[13205]: I0319 09:24:22.002698 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" podUID="51b88818-5108-40db-90c8-4f2e7198959e" containerName="cluster-version-operator" containerID="cri-o://caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9" gracePeriod=130 Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.044417 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e517f77-ec32-4376-b1ff-88ec24a22e3e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-tkj2j\" (UID: \"0e517f77-ec32-4376-b1ff-88ec24a22e3e\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.044678 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9e10cb6e-5703-4e4d-a82b-f6de34888b65-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.044701 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbsl4\" (UniqueName: \"kubernetes.io/projected/9e10cb6e-5703-4e4d-a82b-f6de34888b65-kube-api-access-bbsl4\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.044729 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e10cb6e-5703-4e4d-a82b-f6de34888b65-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.044903 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnsbc\" (UniqueName: \"kubernetes.io/projected/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-kube-api-access-fnsbc\") pod \"machine-approver-5c6485487f-wdh89\" (UID: 
\"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.044942 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-machine-approver-tls\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.044979 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff50023c-0f3f-4506-b26f-9872d0eec45e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-n7hxq\" (UID: \"ff50023c-0f3f-4506-b26f-9872d0eec45e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:22.045049 master-0 kubenswrapper[13205]: I0319 09:24:22.045034 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-auth-proxy-config\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.045668 master-0 kubenswrapper[13205]: I0319 09:24:22.045091 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpqn7\" (UniqueName: \"kubernetes.io/projected/0e517f77-ec32-4376-b1ff-88ec24a22e3e-kube-api-access-rpqn7\") pod \"cluster-samples-operator-85f7577d78-tkj2j\" (UID: \"0e517f77-ec32-4376-b1ff-88ec24a22e3e\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:22.045668 master-0 kubenswrapper[13205]: I0319 09:24:22.045131 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e10cb6e-5703-4e4d-a82b-f6de34888b65-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:22.045668 master-0 kubenswrapper[13205]: I0319 09:24:22.045166 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvr9x\" (UniqueName: \"kubernetes.io/projected/ff50023c-0f3f-4506-b26f-9872d0eec45e-kube-api-access-cvr9x\") pod \"cluster-storage-operator-7d87854d6-n7hxq\" (UID: \"ff50023c-0f3f-4506-b26f-9872d0eec45e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:22.045668 master-0 kubenswrapper[13205]: I0319 09:24:22.045199 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-config\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.045668 master-0 kubenswrapper[13205]: I0319 09:24:22.045223 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/9e10cb6e-5703-4e4d-a82b-f6de34888b65-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 
09:24:22.045668 master-0 kubenswrapper[13205]: I0319 09:24:22.045373 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/9e10cb6e-5703-4e4d-a82b-f6de34888b65-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:22.047085 master-0 kubenswrapper[13205]: I0319 09:24:22.047042 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9e10cb6e-5703-4e4d-a82b-f6de34888b65-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:22.047292 master-0 kubenswrapper[13205]: I0319 09:24:22.047250 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9e10cb6e-5703-4e4d-a82b-f6de34888b65-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:22.048932 master-0 kubenswrapper[13205]: I0319 09:24:22.048884 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e10cb6e-5703-4e4d-a82b-f6de34888b65-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 
09:24:22.050651 master-0 kubenswrapper[13205]: I0319 09:24:22.050614 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e517f77-ec32-4376-b1ff-88ec24a22e3e-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-tkj2j\" (UID: \"0e517f77-ec32-4376-b1ff-88ec24a22e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:22.146585 master-0 kubenswrapper[13205]: I0319 09:24:22.146490 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnsbc\" (UniqueName: \"kubernetes.io/projected/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-kube-api-access-fnsbc\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.146585 master-0 kubenswrapper[13205]: I0319 09:24:22.146552 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-machine-approver-tls\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.146585 master-0 kubenswrapper[13205]: I0319 09:24:22.146572 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff50023c-0f3f-4506-b26f-9872d0eec45e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-n7hxq\" (UID: \"ff50023c-0f3f-4506-b26f-9872d0eec45e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:22.146585 master-0 kubenswrapper[13205]: I0319 09:24:22.146591 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-auth-proxy-config\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.146935 master-0 kubenswrapper[13205]: I0319 09:24:22.146795 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvr9x\" (UniqueName: \"kubernetes.io/projected/ff50023c-0f3f-4506-b26f-9872d0eec45e-kube-api-access-cvr9x\") pod \"cluster-storage-operator-7d87854d6-n7hxq\" (UID: \"ff50023c-0f3f-4506-b26f-9872d0eec45e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:22.146935 master-0 kubenswrapper[13205]: I0319 09:24:22.146839 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-config\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.147797 master-0 kubenswrapper[13205]: I0319 09:24:22.147624 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-auth-proxy-config\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.148937 master-0 kubenswrapper[13205]: I0319 09:24:22.148891 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-config\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " 
pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.150228 master-0 kubenswrapper[13205]: I0319 09:24:22.149995 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff50023c-0f3f-4506-b26f-9872d0eec45e-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-n7hxq\" (UID: \"ff50023c-0f3f-4506-b26f-9872d0eec45e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:22.153413 master-0 kubenswrapper[13205]: I0319 09:24:22.153356 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-machine-approver-tls\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:22.247962 master-0 kubenswrapper[13205]: I0319 09:24:22.247507 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvtc\" (UniqueName: \"kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:22.247962 master-0 kubenswrapper[13205]: E0319 09:24:22.247740 13205 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 19 09:24:22.247962 master-0 kubenswrapper[13205]: E0319 09:24:22.247787 13205 projected.go:194] Error preparing data for projected volume kube-api-access-5cvtc for pod openshift-dns/dns-default-79jrh: configmap "kube-root-ca.crt" not found Mar 19 09:24:22.247962 master-0 kubenswrapper[13205]: E0319 09:24:22.247854 13205 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:23.24783374 +0000 UTC m=+48.580140638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5cvtc" (UniqueName: "kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : configmap "kube-root-ca.crt" not found Mar 19 09:24:22.312014 master-0 kubenswrapper[13205]: I0319 09:24:22.311955 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq"] Mar 19 09:24:22.381340 master-0 kubenswrapper[13205]: I0319 09:24:22.381271 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rb955" event={"ID":"fce9ea11-1498-4ef6-ba71-d125c193159c","Type":"ContainerStarted","Data":"b5c74faeebcb81d39ec885ec20cdc74536c52d3ee542b9f8bff950e59550c3e9"} Mar 19 09:24:22.529627 master-0 kubenswrapper[13205]: I0319 09:24:22.529578 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 09:24:22.538056 master-0 kubenswrapper[13205]: I0319 09:24:22.538008 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:22.634417 master-0 kubenswrapper[13205]: I0319 09:24:22.634366 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xh8t6" Mar 19 
09:24:22.837168 master-0 kubenswrapper[13205]: E0319 09:24:22.837046 13205 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 09:24:22.837577 master-0 kubenswrapper[13205]: E0319 09:24:22.837550 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cloud-credential-operator-serving-cert podName:6775d7ec-8114-4fc3-a23d-d5ac910f3285 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:23.337498053 +0000 UTC m=+48.669804961 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-744f9dbf77-zgn8x" (UID: "6775d7ec-8114-4fc3-a23d-d5ac910f3285") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:24:22.855043 master-0 kubenswrapper[13205]: I0319 09:24:22.854975 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:22.855291 master-0 kubenswrapper[13205]: E0319 09:24:22.855191 13205 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 19 09:24:22.855291 master-0 kubenswrapper[13205]: E0319 09:24:22.855252 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:24.855234623 +0000 UTC m=+50.187541511 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : secret "dns-default-metrics-tls" not found Mar 19 09:24:22.901577 master-0 kubenswrapper[13205]: I0319 09:24:22.901502 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 09:24:23.011663 master-0 kubenswrapper[13205]: I0319 09:24:23.011618 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 09:24:23.145749 master-0 kubenswrapper[13205]: I0319 09:24:23.135084 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvr9x\" (UniqueName: \"kubernetes.io/projected/ff50023c-0f3f-4506-b26f-9872d0eec45e-kube-api-access-cvr9x\") pod \"cluster-storage-operator-7d87854d6-n7hxq\" (UID: \"ff50023c-0f3f-4506-b26f-9872d0eec45e\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:23.145749 master-0 kubenswrapper[13205]: I0319 09:24:23.140112 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnsbc\" (UniqueName: \"kubernetes.io/projected/e8ca673b-2a2f-4ecf-a142-7fe10fcac707-kube-api-access-fnsbc\") pod \"machine-approver-5c6485487f-wdh89\" (UID: \"e8ca673b-2a2f-4ecf-a142-7fe10fcac707\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:23.154470 master-0 kubenswrapper[13205]: I0319 09:24:23.154417 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbsl4\" (UniqueName: \"kubernetes.io/projected/9e10cb6e-5703-4e4d-a82b-f6de34888b65-kube-api-access-bbsl4\") pod \"cluster-cloud-controller-manager-operator-7dff898856-r2cs7\" (UID: \"9e10cb6e-5703-4e4d-a82b-f6de34888b65\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:23.159641 master-0 kubenswrapper[13205]: I0319 09:24:23.159600 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpqn7\" (UniqueName: \"kubernetes.io/projected/0e517f77-ec32-4376-b1ff-88ec24a22e3e-kube-api-access-rpqn7\") pod \"cluster-samples-operator-85f7577d78-tkj2j\" (UID: \"0e517f77-ec32-4376-b1ff-88ec24a22e3e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:23.178588 master-0 kubenswrapper[13205]: I0319 09:24:23.178111 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" Mar 19 09:24:23.219970 master-0 kubenswrapper[13205]: I0319 09:24:23.219916 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" Mar 19 09:24:23.238455 master-0 kubenswrapper[13205]: W0319 09:24:23.238050 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e10cb6e_5703_4e4d_a82b_f6de34888b65.slice/crio-b5fc20fbb8c511bbc276f78352a02f67639aa9b3e321346dc845929306d32093 WatchSource:0}: Error finding container b5fc20fbb8c511bbc276f78352a02f67639aa9b3e321346dc845929306d32093: Status 404 returned error can't find the container with id b5fc20fbb8c511bbc276f78352a02f67639aa9b3e321346dc845929306d32093 Mar 19 09:24:23.240439 master-0 kubenswrapper[13205]: I0319 09:24:23.240401 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" Mar 19 09:24:23.260574 master-0 kubenswrapper[13205]: I0319 09:24:23.260008 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvtc\" (UniqueName: \"kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:23.260574 master-0 kubenswrapper[13205]: E0319 09:24:23.260300 13205 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 19 09:24:23.260574 master-0 kubenswrapper[13205]: E0319 09:24:23.260333 13205 projected.go:194] Error preparing data for projected volume kube-api-access-5cvtc for pod openshift-dns/dns-default-79jrh: configmap "kube-root-ca.crt" not found Mar 19 09:24:23.260574 master-0 kubenswrapper[13205]: E0319 09:24:23.260396 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:25.2603729 +0000 UTC m=+50.592679798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5cvtc" (UniqueName: "kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : configmap "kube-root-ca.crt" not found Mar 19 09:24:23.266868 master-0 kubenswrapper[13205]: I0319 09:24:23.266841 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" Mar 19 09:24:23.276776 master-0 kubenswrapper[13205]: I0319 09:24:23.276716 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 19 09:24:23.293733 master-0 kubenswrapper[13205]: I0319 09:24:23.293250 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lqwx\" (UniqueName: \"kubernetes.io/projected/6775d7ec-8114-4fc3-a23d-d5ac910f3285-kube-api-access-7lqwx\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:23.361723 master-0 kubenswrapper[13205]: I0319 09:24:23.361046 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:23.365042 master-0 kubenswrapper[13205]: I0319 09:24:23.364491 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6775d7ec-8114-4fc3-a23d-d5ac910f3285-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-zgn8x\" (UID: \"6775d7ec-8114-4fc3-a23d-d5ac910f3285\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:23.391890 master-0 kubenswrapper[13205]: I0319 09:24:23.391842 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rb955" 
event={"ID":"fce9ea11-1498-4ef6-ba71-d125c193159c","Type":"ContainerStarted","Data":"73d112edb06118e3fa2f31c7a7de8c542caf595c0d12a5f3a6f88dabb216451a"} Mar 19 09:24:23.393322 master-0 kubenswrapper[13205]: I0319 09:24:23.393296 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerStarted","Data":"b5fc20fbb8c511bbc276f78352a02f67639aa9b3e321346dc845929306d32093"} Mar 19 09:24:23.394412 master-0 kubenswrapper[13205]: I0319 09:24:23.394378 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" event={"ID":"e8ca673b-2a2f-4ecf-a142-7fe10fcac707","Type":"ContainerStarted","Data":"175b6dc3e65f3e7daefcd513415b26580fc0d67cd43ba637dbc6fb34c3efef73"} Mar 19 09:24:23.492700 master-0 kubenswrapper[13205]: I0319 09:24:23.492514 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-4qq6m"] Mar 19 09:24:23.494446 master-0 kubenswrapper[13205]: I0319 09:24:23.494266 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj"] Mar 19 09:24:23.496613 master-0 kubenswrapper[13205]: I0319 09:24:23.494678 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.496613 master-0 kubenswrapper[13205]: I0319 09:24:23.495000 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.500581 master-0 kubenswrapper[13205]: I0319 09:24:23.499020 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 09:24:23.500581 master-0 kubenswrapper[13205]: I0319 09:24:23.499639 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 09:24:23.500581 master-0 kubenswrapper[13205]: I0319 09:24:23.499828 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 09:24:23.500581 master-0 kubenswrapper[13205]: I0319 09:24:23.499984 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 09:24:23.500581 master-0 kubenswrapper[13205]: I0319 09:24:23.500211 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 09:24:23.502334 master-0 kubenswrapper[13205]: I0319 09:24:23.500943 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-lsnll" Mar 19 09:24:23.502334 master-0 kubenswrapper[13205]: I0319 09:24:23.501213 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-d89qv" Mar 19 09:24:23.502334 master-0 kubenswrapper[13205]: I0319 09:24:23.501365 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 09:24:23.502334 master-0 kubenswrapper[13205]: I0319 09:24:23.501386 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 19 09:24:23.505303 master-0 kubenswrapper[13205]: I0319 09:24:23.505256 13205 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 19 09:24:23.522494 master-0 kubenswrapper[13205]: I0319 09:24:23.522160 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563554 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7c70267e-b555-4d56-92e4-f24b65b61283-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563615 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c70267e-b555-4d56-92e4-f24b65b61283-serving-cert\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563703 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4869583f-43af-4ec9-8dea-1da1634816dc-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563735 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4869583f-43af-4ec9-8dea-1da1634816dc-service-ca-bundle\") pod 
\"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563771 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpt9m\" (UniqueName: \"kubernetes.io/projected/7c70267e-b555-4d56-92e4-f24b65b61283-kube-api-access-rpt9m\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563838 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4869583f-43af-4ec9-8dea-1da1634816dc-serving-cert\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563871 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6dc\" (UniqueName: \"kubernetes.io/projected/4869583f-43af-4ec9-8dea-1da1634816dc-kube-api-access-st6dc\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.564066 master-0 kubenswrapper[13205]: I0319 09:24:23.563901 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4869583f-43af-4ec9-8dea-1da1634816dc-snapshots\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.664746 master-0 
kubenswrapper[13205]: I0319 09:24:23.664681 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4869583f-43af-4ec9-8dea-1da1634816dc-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.664746 master-0 kubenswrapper[13205]: I0319 09:24:23.664727 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4869583f-43af-4ec9-8dea-1da1634816dc-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.664746 master-0 kubenswrapper[13205]: I0319 09:24:23.664749 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpt9m\" (UniqueName: \"kubernetes.io/projected/7c70267e-b555-4d56-92e4-f24b65b61283-kube-api-access-rpt9m\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.665044 master-0 kubenswrapper[13205]: I0319 09:24:23.664796 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4869583f-43af-4ec9-8dea-1da1634816dc-serving-cert\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.665044 master-0 kubenswrapper[13205]: I0319 09:24:23.664816 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4869583f-43af-4ec9-8dea-1da1634816dc-snapshots\") pod 
\"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.665044 master-0 kubenswrapper[13205]: I0319 09:24:23.664830 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6dc\" (UniqueName: \"kubernetes.io/projected/4869583f-43af-4ec9-8dea-1da1634816dc-kube-api-access-st6dc\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.665044 master-0 kubenswrapper[13205]: I0319 09:24:23.664856 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7c70267e-b555-4d56-92e4-f24b65b61283-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.665044 master-0 kubenswrapper[13205]: I0319 09:24:23.664875 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c70267e-b555-4d56-92e4-f24b65b61283-serving-cert\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.666085 master-0 kubenswrapper[13205]: I0319 09:24:23.666059 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7c70267e-b555-4d56-92e4-f24b65b61283-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.666225 master-0 kubenswrapper[13205]: I0319 
09:24:23.666204 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4869583f-43af-4ec9-8dea-1da1634816dc-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.666714 master-0 kubenswrapper[13205]: I0319 09:24:23.666664 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4869583f-43af-4ec9-8dea-1da1634816dc-snapshots\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.667498 master-0 kubenswrapper[13205]: I0319 09:24:23.667458 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4869583f-43af-4ec9-8dea-1da1634816dc-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.671387 master-0 kubenswrapper[13205]: I0319 09:24:23.671288 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c70267e-b555-4d56-92e4-f24b65b61283-serving-cert\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.671496 master-0 kubenswrapper[13205]: I0319 09:24:23.671433 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4869583f-43af-4ec9-8dea-1da1634816dc-serving-cert\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " 
pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:23.899636 master-0 kubenswrapper[13205]: I0319 09:24:23.899562 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5"] Mar 19 09:24:23.900705 master-0 kubenswrapper[13205]: I0319 09:24:23.900676 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:23.901096 master-0 kubenswrapper[13205]: I0319 09:24:23.900681 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn"] Mar 19 09:24:23.903394 master-0 kubenswrapper[13205]: I0319 09:24:23.903365 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-4qq6m"] Mar 19 09:24:23.903505 master-0 kubenswrapper[13205]: I0319 09:24:23.903465 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:23.904038 master-0 kubenswrapper[13205]: I0319 09:24:23.903977 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 09:24:23.904122 master-0 kubenswrapper[13205]: I0319 09:24:23.904078 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-45zpl" Mar 19 09:24:23.906820 master-0 kubenswrapper[13205]: I0319 09:24:23.904752 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj"] Mar 19 09:24:23.911587 master-0 kubenswrapper[13205]: I0319 09:24:23.909875 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tv2z8" Mar 19 09:24:23.911587 master-0 kubenswrapper[13205]: I0319 09:24:23.910187 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 09:24:23.911587 master-0 kubenswrapper[13205]: I0319 09:24:23.910565 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 19 09:24:23.911587 master-0 kubenswrapper[13205]: I0319 09:24:23.910731 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 19 09:24:23.911587 master-0 kubenswrapper[13205]: I0319 09:24:23.911218 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 09:24:23.911587 master-0 kubenswrapper[13205]: I0319 09:24:23.911243 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 19 09:24:23.911587 master-0 
kubenswrapper[13205]: I0319 09:24:23.911434 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 09:24:23.911958 master-0 kubenswrapper[13205]: I0319 09:24:23.911664 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 19 09:24:23.935342 master-0 kubenswrapper[13205]: I0319 09:24:23.935279 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpt9m\" (UniqueName: \"kubernetes.io/projected/7c70267e-b555-4d56-92e4-f24b65b61283-kube-api-access-rpt9m\") pod \"openshift-config-operator-95bf4f4d-rfnfj\" (UID: \"7c70267e-b555-4d56-92e4-f24b65b61283\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:23.968033 master-0 kubenswrapper[13205]: I0319 09:24:23.967982 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/87b757ff-ca45-4dc7-b31f-ccca53cb2354-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:23.968192 master-0 kubenswrapper[13205]: I0319 09:24:23.968045 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/87b757ff-ca45-4dc7-b31f-ccca53cb2354-images\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:23.968192 master-0 kubenswrapper[13205]: I0319 09:24:23.968078 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/6904be4c-4f5f-4176-8100-7b6955c6d8da-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:23.968192 master-0 kubenswrapper[13205]: I0319 09:24:23.968126 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87b757ff-ca45-4dc7-b31f-ccca53cb2354-cert\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:23.968192 master-0 kubenswrapper[13205]: I0319 09:24:23.968187 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbkrk\" (UniqueName: \"kubernetes.io/projected/87b757ff-ca45-4dc7-b31f-ccca53cb2354-kube-api-access-qbkrk\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:23.968335 master-0 kubenswrapper[13205]: I0319 09:24:23.968267 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22x4r\" (UniqueName: \"kubernetes.io/projected/6904be4c-4f5f-4176-8100-7b6955c6d8da-kube-api-access-22x4r\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:23.968335 master-0 kubenswrapper[13205]: I0319 09:24:23.968305 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b757ff-ca45-4dc7-b31f-ccca53cb2354-config\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: 
\"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:23.968389 master-0 kubenswrapper[13205]: I0319 09:24:23.968361 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6904be4c-4f5f-4176-8100-7b6955c6d8da-cert\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:24.069201 master-0 kubenswrapper[13205]: I0319 09:24:24.069144 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b757ff-ca45-4dc7-b31f-ccca53cb2354-config\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.069432 master-0 kubenswrapper[13205]: I0319 09:24:24.069235 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6904be4c-4f5f-4176-8100-7b6955c6d8da-cert\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:24.069432 master-0 kubenswrapper[13205]: I0319 09:24:24.069391 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/87b757ff-ca45-4dc7-b31f-ccca53cb2354-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.069432 master-0 kubenswrapper[13205]: I0319 09:24:24.069427 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/87b757ff-ca45-4dc7-b31f-ccca53cb2354-images\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.069709 master-0 kubenswrapper[13205]: I0319 09:24:24.069665 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6904be4c-4f5f-4176-8100-7b6955c6d8da-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:24.069780 master-0 kubenswrapper[13205]: I0319 09:24:24.069712 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87b757ff-ca45-4dc7-b31f-ccca53cb2354-cert\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.069780 master-0 kubenswrapper[13205]: I0319 09:24:24.069752 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbkrk\" (UniqueName: \"kubernetes.io/projected/87b757ff-ca45-4dc7-b31f-ccca53cb2354-kube-api-access-qbkrk\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.069928 master-0 kubenswrapper[13205]: I0319 09:24:24.069903 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22x4r\" (UniqueName: \"kubernetes.io/projected/6904be4c-4f5f-4176-8100-7b6955c6d8da-kube-api-access-22x4r\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: 
\"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:24.070579 master-0 kubenswrapper[13205]: I0319 09:24:24.070492 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87b757ff-ca45-4dc7-b31f-ccca53cb2354-config\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.070738 master-0 kubenswrapper[13205]: I0319 09:24:24.070701 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/87b757ff-ca45-4dc7-b31f-ccca53cb2354-images\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.071221 master-0 kubenswrapper[13205]: I0319 09:24:24.071109 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6904be4c-4f5f-4176-8100-7b6955c6d8da-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:24.073026 master-0 kubenswrapper[13205]: I0319 09:24:24.072975 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87b757ff-ca45-4dc7-b31f-ccca53cb2354-cert\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.073495 master-0 kubenswrapper[13205]: I0319 09:24:24.073444 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/6904be4c-4f5f-4176-8100-7b6955c6d8da-cert\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" Mar 19 09:24:24.073754 master-0 kubenswrapper[13205]: I0319 09:24:24.073693 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/87b757ff-ca45-4dc7-b31f-ccca53cb2354-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" Mar 19 09:24:24.165288 master-0 kubenswrapper[13205]: I0319 09:24:24.165198 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:24:24.320951 master-0 kubenswrapper[13205]: I0319 09:24:24.317609 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5"] Mar 19 09:24:24.320951 master-0 kubenswrapper[13205]: I0319 09:24:24.320799 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq"] Mar 19 09:24:24.329127 master-0 kubenswrapper[13205]: I0319 09:24:24.324843 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j"] Mar 19 09:24:24.331512 master-0 kubenswrapper[13205]: I0319 09:24:24.330284 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn"] Mar 19 09:24:24.333181 master-0 kubenswrapper[13205]: I0319 09:24:24.332828 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x"] Mar 19 
09:24:24.338100 master-0 kubenswrapper[13205]: W0319 09:24:24.338065 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6775d7ec_8114_4fc3_a23d_d5ac910f3285.slice/crio-2e2e2c717b4a44fb2d8cb369436c75d1bf67993f52e0f131e47f5f570f0b95db WatchSource:0}: Error finding container 2e2e2c717b4a44fb2d8cb369436c75d1bf67993f52e0f131e47f5f570f0b95db: Status 404 returned error can't find the container with id 2e2e2c717b4a44fb2d8cb369436c75d1bf67993f52e0f131e47f5f570f0b95db Mar 19 09:24:24.342104 master-0 kubenswrapper[13205]: I0319 09:24:24.341543 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6dc\" (UniqueName: \"kubernetes.io/projected/4869583f-43af-4ec9-8dea-1da1634816dc-kube-api-access-st6dc\") pod \"insights-operator-68bf6ff9d6-4qq6m\" (UID: \"4869583f-43af-4ec9-8dea-1da1634816dc\") " pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:24.383066 master-0 kubenswrapper[13205]: I0319 09:24:24.381081 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:24:24.400049 master-0 kubenswrapper[13205]: I0319 09:24:24.399994 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" event={"ID":"ff50023c-0f3f-4506-b26f-9872d0eec45e","Type":"ContainerStarted","Data":"7e1de03d355303222c22646eb08caee8c8127e0b1bc34e140aedcfcdaca482c0"} Mar 19 09:24:24.401436 master-0 kubenswrapper[13205]: I0319 09:24:24.401399 13205 generic.go:334] "Generic (PLEG): container finished" podID="51b88818-5108-40db-90c8-4f2e7198959e" containerID="caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9" exitCode=0 Mar 19 09:24:24.401498 master-0 kubenswrapper[13205]: I0319 09:24:24.401447 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" event={"ID":"51b88818-5108-40db-90c8-4f2e7198959e","Type":"ContainerDied","Data":"caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9"} Mar 19 09:24:24.401498 master-0 kubenswrapper[13205]: I0319 09:24:24.401465 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" event={"ID":"51b88818-5108-40db-90c8-4f2e7198959e","Type":"ContainerDied","Data":"2451d7e3dd79303504d5964f5bc9fe498e3fce32e9bf236a0e1ab73d89c4fa39"} Mar 19 09:24:24.401498 master-0 kubenswrapper[13205]: I0319 09:24:24.401484 13205 scope.go:117] "RemoveContainer" containerID="caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9" Mar 19 09:24:24.401645 master-0 kubenswrapper[13205]: I0319 09:24:24.401618 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-prd2q" Mar 19 09:24:24.403254 master-0 kubenswrapper[13205]: I0319 09:24:24.403228 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" event={"ID":"6775d7ec-8114-4fc3-a23d-d5ac910f3285","Type":"ContainerStarted","Data":"2e2e2c717b4a44fb2d8cb369436c75d1bf67993f52e0f131e47f5f570f0b95db"} Mar 19 09:24:24.405668 master-0 kubenswrapper[13205]: I0319 09:24:24.405635 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" event={"ID":"e8ca673b-2a2f-4ecf-a142-7fe10fcac707","Type":"ContainerStarted","Data":"c62984f1d503fbf79de1a2056e7dd2bd9e769e2c69bf2b7304310d727d4c53bc"} Mar 19 09:24:24.421840 master-0 kubenswrapper[13205]: I0319 09:24:24.421800 13205 scope.go:117] "RemoveContainer" containerID="caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9" Mar 19 09:24:24.422346 master-0 kubenswrapper[13205]: E0319 09:24:24.422307 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9\": container with ID starting with caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9 not found: ID does not exist" containerID="caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9" Mar 19 09:24:24.422506 master-0 kubenswrapper[13205]: I0319 09:24:24.422351 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9"} err="failed to get container status \"caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9\": rpc error: code = NotFound desc = could not find container \"caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9\": container with ID 
starting with caeca672ddd1b5fe67c0e8945caaac1a7a870055be645895c7e398ffa52391b9 not found: ID does not exist" Mar 19 09:24:24.437266 master-0 kubenswrapper[13205]: I0319 09:24:24.437223 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" Mar 19 09:24:24.575406 master-0 kubenswrapper[13205]: I0319 09:24:24.574550 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") pod \"51b88818-5108-40db-90c8-4f2e7198959e\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " Mar 19 09:24:24.575406 master-0 kubenswrapper[13205]: I0319 09:24:24.575092 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") pod \"51b88818-5108-40db-90c8-4f2e7198959e\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " Mar 19 09:24:24.576549 master-0 kubenswrapper[13205]: I0319 09:24:24.576196 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca" (OuterVolumeSpecName: "service-ca") pod "51b88818-5108-40db-90c8-4f2e7198959e" (UID: "51b88818-5108-40db-90c8-4f2e7198959e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:24.576549 master-0 kubenswrapper[13205]: I0319 09:24:24.576341 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") pod \"51b88818-5108-40db-90c8-4f2e7198959e\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " Mar 19 09:24:24.576549 master-0 kubenswrapper[13205]: I0319 09:24:24.576406 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") pod \"51b88818-5108-40db-90c8-4f2e7198959e\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " Mar 19 09:24:24.576549 master-0 kubenswrapper[13205]: I0319 09:24:24.576488 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") pod \"51b88818-5108-40db-90c8-4f2e7198959e\" (UID: \"51b88818-5108-40db-90c8-4f2e7198959e\") " Mar 19 09:24:24.577289 master-0 kubenswrapper[13205]: I0319 09:24:24.576866 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "51b88818-5108-40db-90c8-4f2e7198959e" (UID: "51b88818-5108-40db-90c8-4f2e7198959e"). InnerVolumeSpecName "etc-cvo-updatepayloads". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:24.577289 master-0 kubenswrapper[13205]: I0319 09:24:24.576886 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "51b88818-5108-40db-90c8-4f2e7198959e" (UID: "51b88818-5108-40db-90c8-4f2e7198959e"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:24.577289 master-0 kubenswrapper[13205]: I0319 09:24:24.577119 13205 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51b88818-5108-40db-90c8-4f2e7198959e-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:24.577289 master-0 kubenswrapper[13205]: I0319 09:24:24.577156 13205 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:24.577289 master-0 kubenswrapper[13205]: I0319 09:24:24.577170 13205 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/51b88818-5108-40db-90c8-4f2e7198959e-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:24.583828 master-0 kubenswrapper[13205]: I0319 09:24:24.583773 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51b88818-5108-40db-90c8-4f2e7198959e" (UID: "51b88818-5108-40db-90c8-4f2e7198959e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:24.588571 master-0 kubenswrapper[13205]: I0319 09:24:24.586999 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "51b88818-5108-40db-90c8-4f2e7198959e" (UID: "51b88818-5108-40db-90c8-4f2e7198959e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:24.678254 master-0 kubenswrapper[13205]: I0319 09:24:24.678199 13205 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51b88818-5108-40db-90c8-4f2e7198959e-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:24.678254 master-0 kubenswrapper[13205]: I0319 09:24:24.678238 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51b88818-5108-40db-90c8-4f2e7198959e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:24.881104 master-0 kubenswrapper[13205]: I0319 09:24:24.881006 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:24.881508 master-0 kubenswrapper[13205]: E0319 09:24:24.881346 13205 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 19 09:24:24.881508 master-0 kubenswrapper[13205]: E0319 09:24:24.881506 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:24:28.88147745 +0000 UTC m=+54.213784338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : secret "dns-default-metrics-tls" not found Mar 19 09:24:24.916314 master-0 kubenswrapper[13205]: I0319 09:24:24.916266 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l"] Mar 19 09:24:24.916584 master-0 kubenswrapper[13205]: E0319 09:24:24.916554 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b88818-5108-40db-90c8-4f2e7198959e" containerName="cluster-version-operator" Mar 19 09:24:24.916584 master-0 kubenswrapper[13205]: I0319 09:24:24.916575 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b88818-5108-40db-90c8-4f2e7198959e" containerName="cluster-version-operator" Mar 19 09:24:24.916705 master-0 kubenswrapper[13205]: I0319 09:24:24.916686 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b88818-5108-40db-90c8-4f2e7198959e" containerName="cluster-version-operator" Mar 19 09:24:24.917102 master-0 kubenswrapper[13205]: I0319 09:24:24.917069 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" Mar 19 09:24:24.921510 master-0 kubenswrapper[13205]: I0319 09:24:24.921458 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"] Mar 19 09:24:24.922613 master-0 kubenswrapper[13205]: I0319 09:24:24.922585 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" Mar 19 09:24:24.924815 master-0 kubenswrapper[13205]: I0319 09:24:24.924636 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-99v25" Mar 19 09:24:24.924815 master-0 kubenswrapper[13205]: I0319 09:24:24.924745 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 09:24:24.925147 master-0 kubenswrapper[13205]: I0319 09:24:24.925002 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 09:24:24.925147 master-0 kubenswrapper[13205]: I0319 09:24:24.925073 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 09:24:24.925147 master-0 kubenswrapper[13205]: I0319 09:24:24.925135 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-khk7h" Mar 19 09:24:24.925411 master-0 kubenswrapper[13205]: I0319 09:24:24.925381 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 09:24:24.982197 master-0 kubenswrapper[13205]: I0319 09:24:24.982087 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a0f93ac-a77b-488a-bcc4-a45702a9e32d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-v9n8l\" (UID: \"9a0f93ac-a77b-488a-bcc4-a45702a9e32d\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" Mar 19 09:24:24.982197 master-0 kubenswrapper[13205]: I0319 09:24:24.982134 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-images\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" Mar 19 09:24:24.982197 master-0 kubenswrapper[13205]: I0319 09:24:24.982168 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj5r8\" (UniqueName: \"kubernetes.io/projected/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-kube-api-access-kj5r8\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" Mar 19 09:24:24.982463 master-0 kubenswrapper[13205]: I0319 09:24:24.982208 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g64fj\" (UniqueName: \"kubernetes.io/projected/9a0f93ac-a77b-488a-bcc4-a45702a9e32d-kube-api-access-g64fj\") pod \"control-plane-machine-set-operator-6f97756bc8-v9n8l\" (UID: \"9a0f93ac-a77b-488a-bcc4-a45702a9e32d\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" Mar 19 09:24:24.982463 master-0 kubenswrapper[13205]: I0319 09:24:24.982253 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" Mar 19 09:24:24.982463 master-0 kubenswrapper[13205]: I0319 09:24:24.982269 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-config\") pod 
\"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: I0319 09:24:25.080038 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"] Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: I0319 09:24:25.081026 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s" Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: I0319 09:24:25.082991 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g64fj\" (UniqueName: \"kubernetes.io/projected/9a0f93ac-a77b-488a-bcc4-a45702a9e32d-kube-api-access-g64fj\") pod \"control-plane-machine-set-operator-6f97756bc8-v9n8l\" (UID: \"9a0f93ac-a77b-488a-bcc4-a45702a9e32d\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: I0319 09:24:25.083058 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: I0319 09:24:25.083080 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-config\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: 
I0319 09:24:25.083111 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-images\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"
Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: I0319 09:24:25.083126 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a0f93ac-a77b-488a-bcc4-a45702a9e32d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-v9n8l\" (UID: \"9a0f93ac-a77b-488a-bcc4-a45702a9e32d\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l"
Mar 19 09:24:25.084594 master-0 kubenswrapper[13205]: I0319 09:24:25.083149 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj5r8\" (UniqueName: \"kubernetes.io/projected/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-kube-api-access-kj5r8\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"
Mar 19 09:24:25.090925 master-0 kubenswrapper[13205]: I0319 09:24:25.090813 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 09:24:25.090925 master-0 kubenswrapper[13205]: I0319 09:24:25.090903 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-config\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"
Mar 19 09:24:25.091209 master-0 kubenswrapper[13205]: I0319 09:24:25.091009 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 09:24:25.091269 master-0 kubenswrapper[13205]: I0319 09:24:25.091221 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 09:24:25.091437 master-0 kubenswrapper[13205]: I0319 09:24:25.091370 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 09:24:25.091548 master-0 kubenswrapper[13205]: I0319 09:24:25.091500 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-mvv8v"
Mar 19 09:24:25.091713 master-0 kubenswrapper[13205]: I0319 09:24:25.091519 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-images\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"
Mar 19 09:24:25.091713 master-0 kubenswrapper[13205]: I0319 09:24:25.091513 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 09:24:25.094858 master-0 kubenswrapper[13205]: I0319 09:24:25.094230 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l"]
Mar 19 09:24:25.094858 master-0 kubenswrapper[13205]: I0319 09:24:25.094466 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a0f93ac-a77b-488a-bcc4-a45702a9e32d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-v9n8l\" (UID: \"9a0f93ac-a77b-488a-bcc4-a45702a9e32d\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l"
Mar 19 09:24:25.096126 master-0 kubenswrapper[13205]: I0319 09:24:25.095959 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"
Mar 19 09:24:25.098218 master-0 kubenswrapper[13205]: I0319 09:24:25.098189 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"]
Mar 19 09:24:25.102805 master-0 kubenswrapper[13205]: I0319 09:24:25.102731 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbkrk\" (UniqueName: \"kubernetes.io/projected/87b757ff-ca45-4dc7-b31f-ccca53cb2354-kube-api-access-qbkrk\") pod \"cluster-baremetal-operator-6f69995874-nf2m5\" (UID: \"87b757ff-ca45-4dc7-b31f-ccca53cb2354\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5"
Mar 19 09:24:25.106171 master-0 kubenswrapper[13205]: I0319 09:24:25.106150 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22x4r\" (UniqueName: \"kubernetes.io/projected/6904be4c-4f5f-4176-8100-7b6955c6d8da-kube-api-access-22x4r\") pod \"cluster-autoscaler-operator-866dc4744-rsnsn\" (UID: \"6904be4c-4f5f-4176-8100-7b6955c6d8da\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn"
Mar 19 09:24:25.139584 master-0 kubenswrapper[13205]: I0319 09:24:25.139424 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5"
Mar 19 09:24:25.153154 master-0 kubenswrapper[13205]: I0319 09:24:25.153090 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn"
Mar 19 09:24:25.185475 master-0 kubenswrapper[13205]: I0319 09:24:25.185399 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a0424dd-a9fc-4763-b30c-884076dd64aa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.185808 master-0 kubenswrapper[13205]: I0319 09:24:25.185520 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q86v\" (UniqueName: \"kubernetes.io/projected/6a0424dd-a9fc-4763-b30c-884076dd64aa-kube-api-access-9q86v\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.185808 master-0 kubenswrapper[13205]: I0319 09:24:25.185597 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6a0424dd-a9fc-4763-b30c-884076dd64aa-images\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.185928 master-0 kubenswrapper[13205]: I0319 09:24:25.185843 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a0424dd-a9fc-4763-b30c-884076dd64aa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.286582 master-0 kubenswrapper[13205]: I0319 09:24:25.286543 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvtc\" (UniqueName: \"kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh"
Mar 19 09:24:25.286710 master-0 kubenswrapper[13205]: I0319 09:24:25.286607 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a0424dd-a9fc-4763-b30c-884076dd64aa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.286833 master-0 kubenswrapper[13205]: E0319 09:24:25.286801 13205 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Mar 19 09:24:25.286884 master-0 kubenswrapper[13205]: E0319 09:24:25.286852 13205 projected.go:194] Error preparing data for projected volume kube-api-access-5cvtc for pod openshift-dns/dns-default-79jrh: configmap "kube-root-ca.crt" not found
Mar 19 09:24:25.287087 master-0 kubenswrapper[13205]: E0319 09:24:25.286911 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:29.286890294 +0000 UTC m=+54.619197242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5cvtc" (UniqueName: "kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : configmap "kube-root-ca.crt" not found
Mar 19 09:24:25.287087 master-0 kubenswrapper[13205]: I0319 09:24:25.286950 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a0424dd-a9fc-4763-b30c-884076dd64aa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.287087 master-0 kubenswrapper[13205]: I0319 09:24:25.287018 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q86v\" (UniqueName: \"kubernetes.io/projected/6a0424dd-a9fc-4763-b30c-884076dd64aa-kube-api-access-9q86v\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.287087 master-0 kubenswrapper[13205]: I0319 09:24:25.287050 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6a0424dd-a9fc-4763-b30c-884076dd64aa-images\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.288353 master-0 kubenswrapper[13205]: I0319 09:24:25.288331 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6a0424dd-a9fc-4763-b30c-884076dd64aa-images\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.288700 master-0 kubenswrapper[13205]: I0319 09:24:25.288678 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6a0424dd-a9fc-4763-b30c-884076dd64aa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.290568 master-0 kubenswrapper[13205]: I0319 09:24:25.290486 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a0424dd-a9fc-4763-b30c-884076dd64aa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:25.413093 master-0 kubenswrapper[13205]: I0319 09:24:25.412668 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" event={"ID":"0e517f77-ec32-4376-b1ff-88ec24a22e3e","Type":"ContainerStarted","Data":"9c0293d3836623b2c0fd34c5d268d92a6d946aeea1b58c446ec612c49622a8f1"}
Mar 19 09:24:25.415788 master-0 kubenswrapper[13205]: I0319 09:24:25.415752 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" event={"ID":"6775d7ec-8114-4fc3-a23d-d5ac910f3285","Type":"ContainerStarted","Data":"1ce650b514118d5c910e4fdb7242169a357e184bcfac13688351fa8c06c54f2a"}
Mar 19 09:24:25.771618 master-0 kubenswrapper[13205]: I0319 09:24:25.770911 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj"]
Mar 19 09:24:25.771618 master-0 kubenswrapper[13205]: I0319 09:24:25.770955 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"]
Mar 19 09:24:25.771618 master-0 kubenswrapper[13205]: I0319 09:24:25.770964 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-4qq6m"]
Mar 19 09:24:25.783628 master-0 kubenswrapper[13205]: W0319 09:24:25.783594 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4869583f_43af_4ec9_8dea_1da1634816dc.slice/crio-74e3a690e2e2d0f7ef6e07f092d440cadfd9d95480401627334b6b3e446e07f5 WatchSource:0}: Error finding container 74e3a690e2e2d0f7ef6e07f092d440cadfd9d95480401627334b6b3e446e07f5: Status 404 returned error can't find the container with id 74e3a690e2e2d0f7ef6e07f092d440cadfd9d95480401627334b6b3e446e07f5
Mar 19 09:24:25.785109 master-0 kubenswrapper[13205]: W0319 09:24:25.785063 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c70267e_b555_4d56_92e4_f24b65b61283.slice/crio-c0321b229564733b20e86fd672509c47d0b9054d611c69a0cc0b4e2742659844 WatchSource:0}: Error finding container c0321b229564733b20e86fd672509c47d0b9054d611c69a0cc0b4e2742659844: Status 404 returned error can't find the container with id c0321b229564733b20e86fd672509c47d0b9054d611c69a0cc0b4e2742659844
Mar 19 09:24:26.370261 master-0 kubenswrapper[13205]: I0319 09:24:26.369827 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-64f78496dd-kwdfq"]
Mar 19 09:24:26.371717 master-0 kubenswrapper[13205]: I0319 09:24:26.371681 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:26.388772 master-0 kubenswrapper[13205]: I0319 09:24:26.384765 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-zbzf5"
Mar 19 09:24:26.400980 master-0 kubenswrapper[13205]: I0319 09:24:26.400941 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5"]
Mar 19 09:24:26.404512 master-0 kubenswrapper[13205]: I0319 09:24:26.404470 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g64fj\" (UniqueName: \"kubernetes.io/projected/9a0f93ac-a77b-488a-bcc4-a45702a9e32d-kube-api-access-g64fj\") pod \"control-plane-machine-set-operator-6f97756bc8-v9n8l\" (UID: \"9a0f93ac-a77b-488a-bcc4-a45702a9e32d\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l"
Mar 19 09:24:26.405629 master-0 kubenswrapper[13205]: I0319 09:24:26.405588 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/165e3498-b49e-42fa-a614-0680f8c93fc7-webhook-certs\") pod \"multus-admission-controller-64f78496dd-kwdfq\" (UID: \"165e3498-b49e-42fa-a614-0680f8c93fc7\") " pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:26.405714 master-0 kubenswrapper[13205]: I0319 09:24:26.405694 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tqf4\" (UniqueName: \"kubernetes.io/projected/165e3498-b49e-42fa-a614-0680f8c93fc7-kube-api-access-8tqf4\") pod \"multus-admission-controller-64f78496dd-kwdfq\" (UID: \"165e3498-b49e-42fa-a614-0680f8c93fc7\") " pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:26.409056 master-0 kubenswrapper[13205]: I0319 09:24:26.408990 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj5r8\" (UniqueName: \"kubernetes.io/projected/c10d0e00-cf19-4067-b7bf-ff569f2f3d71-kube-api-access-kj5r8\") pod \"machine-api-operator-6fbb6cf6f9-nmvcv\" (UID: \"c10d0e00-cf19-4067-b7bf-ff569f2f3d71\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"
Mar 19 09:24:26.409833 master-0 kubenswrapper[13205]: I0319 09:24:26.409793 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn"]
Mar 19 09:24:26.428831 master-0 kubenswrapper[13205]: I0319 09:24:26.428795 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q86v\" (UniqueName: \"kubernetes.io/projected/6a0424dd-a9fc-4763-b30c-884076dd64aa-kube-api-access-9q86v\") pod \"machine-config-operator-84d549f6d5-lvh8s\" (UID: \"6a0424dd-a9fc-4763-b30c-884076dd64aa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:26.434902 master-0 kubenswrapper[13205]: I0319 09:24:26.434844 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" event={"ID":"4869583f-43af-4ec9-8dea-1da1634816dc","Type":"ContainerStarted","Data":"74e3a690e2e2d0f7ef6e07f092d440cadfd9d95480401627334b6b3e446e07f5"}
Mar 19 09:24:26.436452 master-0 kubenswrapper[13205]: I0319 09:24:26.436404 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" event={"ID":"7c70267e-b555-4d56-92e4-f24b65b61283","Type":"ContainerStarted","Data":"c0321b229564733b20e86fd672509c47d0b9054d611c69a0cc0b4e2742659844"}
Mar 19 09:24:26.438228 master-0 kubenswrapper[13205]: I0319 09:24:26.438194 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" event={"ID":"87b757ff-ca45-4dc7-b31f-ccca53cb2354","Type":"ContainerStarted","Data":"1cb458697a8d06a05f328a3ae87da5190165c5292077b4b5cc67c27ade9e41cb"}
Mar 19 09:24:26.439273 master-0 kubenswrapper[13205]: I0319 09:24:26.439245 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" event={"ID":"6904be4c-4f5f-4176-8100-7b6955c6d8da","Type":"ContainerStarted","Data":"985c0dabd6fcee23f7c57ebfff33a145d5e966edf73015aa6da8698ec4b4341a"}
Mar 19 09:24:26.453197 master-0 kubenswrapper[13205]: I0319 09:24:26.453154 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l"
Mar 19 09:24:26.468776 master-0 kubenswrapper[13205]: I0319 09:24:26.468729 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"
Mar 19 09:24:26.506973 master-0 kubenswrapper[13205]: I0319 09:24:26.506691 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/165e3498-b49e-42fa-a614-0680f8c93fc7-webhook-certs\") pod \"multus-admission-controller-64f78496dd-kwdfq\" (UID: \"165e3498-b49e-42fa-a614-0680f8c93fc7\") " pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:26.506973 master-0 kubenswrapper[13205]: I0319 09:24:26.506771 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tqf4\" (UniqueName: \"kubernetes.io/projected/165e3498-b49e-42fa-a614-0680f8c93fc7-kube-api-access-8tqf4\") pod \"multus-admission-controller-64f78496dd-kwdfq\" (UID: \"165e3498-b49e-42fa-a614-0680f8c93fc7\") " pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:26.512255 master-0 kubenswrapper[13205]: I0319 09:24:26.512228 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/165e3498-b49e-42fa-a614-0680f8c93fc7-webhook-certs\") pod \"multus-admission-controller-64f78496dd-kwdfq\" (UID: \"165e3498-b49e-42fa-a614-0680f8c93fc7\") " pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:26.619546 master-0 kubenswrapper[13205]: I0319 09:24:26.616676 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"
Mar 19 09:24:26.855003 master-0 kubenswrapper[13205]: I0319 09:24:26.854959 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-64f78496dd-kwdfq"]
Mar 19 09:24:27.138891 master-0 kubenswrapper[13205]: I0319 09:24:27.137042 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"]
Mar 19 09:24:27.138891 master-0 kubenswrapper[13205]: I0319 09:24:27.137774 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.171585 master-0 kubenswrapper[13205]: I0319 09:24:27.166031 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-lpljf"
Mar 19 09:24:27.181571 master-0 kubenswrapper[13205]: I0319 09:24:27.176913 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 09:24:27.197668 master-0 kubenswrapper[13205]: I0319 09:24:27.197368 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tqf4\" (UniqueName: \"kubernetes.io/projected/165e3498-b49e-42fa-a614-0680f8c93fc7-kube-api-access-8tqf4\") pod \"multus-admission-controller-64f78496dd-kwdfq\" (UID: \"165e3498-b49e-42fa-a614-0680f8c93fc7\") " pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:27.213509 master-0 kubenswrapper[13205]: I0319 09:24:27.213456 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"]
Mar 19 09:24:27.217796 master-0 kubenswrapper[13205]: I0319 09:24:27.216916 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtnnf\" (UniqueName: \"kubernetes.io/projected/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-kube-api-access-xtnnf\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.217796 master-0 kubenswrapper[13205]: I0319 09:24:27.216983 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-tmpfs\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.217796 master-0 kubenswrapper[13205]: I0319 09:24:27.217016 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-webhook-cert\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.217796 master-0 kubenswrapper[13205]: I0319 09:24:27.217036 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-apiservice-cert\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.237708 master-0 kubenswrapper[13205]: I0319 09:24:27.237643 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l"]
Mar 19 09:24:27.260549 master-0 kubenswrapper[13205]: I0319 09:24:27.253640 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv"]
Mar 19 09:24:27.262453 master-0 kubenswrapper[13205]: I0319 09:24:27.262401 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s"]
Mar 19 09:24:27.310912 master-0 kubenswrapper[13205]: I0319 09:24:27.310478 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq"
Mar 19 09:24:27.319612 master-0 kubenswrapper[13205]: I0319 09:24:27.319549 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtnnf\" (UniqueName: \"kubernetes.io/projected/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-kube-api-access-xtnnf\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.319693 master-0 kubenswrapper[13205]: I0319 09:24:27.319614 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-tmpfs\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.319693 master-0 kubenswrapper[13205]: I0319 09:24:27.319649 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-webhook-cert\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.319693 master-0 kubenswrapper[13205]: I0319 09:24:27.319673 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-apiservice-cert\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.320168 master-0 kubenswrapper[13205]: I0319 09:24:27.320151 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-tmpfs\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.323447 master-0 kubenswrapper[13205]: I0319 09:24:27.323429 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-apiservice-cert\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.323984 master-0 kubenswrapper[13205]: I0319 09:24:27.323942 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-webhook-cert\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.396434 master-0 kubenswrapper[13205]: I0319 09:24:27.396327 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtnnf\" (UniqueName: \"kubernetes.io/projected/588cf947-93a7-4e1d-b2fe-a281cb4eb44e-kube-api-access-xtnnf\") pod \"packageserver-6f5bddd45b-hzcnw\" (UID: \"588cf947-93a7-4e1d-b2fe-a281cb4eb44e\") " pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.457288 master-0 kubenswrapper[13205]: I0319 09:24:27.457228 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" event={"ID":"6904be4c-4f5f-4176-8100-7b6955c6d8da","Type":"ContainerStarted","Data":"a6be730b3c49ecdd6f47f8d1af607528336dae198d4640ae4d4b03c7b44b3976"}
Mar 19 09:24:27.468391 master-0 kubenswrapper[13205]: I0319 09:24:27.468005 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"
Mar 19 09:24:27.852922 master-0 kubenswrapper[13205]: I0319 09:24:27.852732 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rb955" podStartSLOduration=7.852716752 podStartE2EDuration="7.852716752s" podCreationTimestamp="2026-03-19 09:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:24:27.852712342 +0000 UTC m=+53.185019230" watchObservedRunningTime="2026-03-19 09:24:27.852716752 +0000 UTC m=+53.185023640"
Mar 19 09:24:27.943600 master-0 kubenswrapper[13205]: I0319 09:24:27.941267 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7llkw"]
Mar 19 09:24:27.943600 master-0 kubenswrapper[13205]: I0319 09:24:27.942263 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.030706 master-0 kubenswrapper[13205]: I0319 09:24:28.028726 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt2r9\" (UniqueName: \"kubernetes.io/projected/89f3f27a-83eb-4cd9-b557-aeee15998793-kube-api-access-tt2r9\") pod \"node-resolver-7llkw\" (UID: \"89f3f27a-83eb-4cd9-b557-aeee15998793\") " pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.030706 master-0 kubenswrapper[13205]: I0319 09:24:28.028787 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/89f3f27a-83eb-4cd9-b557-aeee15998793-hosts-file\") pod \"node-resolver-7llkw\" (UID: \"89f3f27a-83eb-4cd9-b557-aeee15998793\") " pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.038640 master-0 kubenswrapper[13205]: I0319 09:24:28.036671 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"]
Mar 19 09:24:28.045732 master-0 kubenswrapper[13205]: I0319 09:24:28.045670 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-prd2q"]
Mar 19 09:24:28.130402 master-0 kubenswrapper[13205]: I0319 09:24:28.130284 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt2r9\" (UniqueName: \"kubernetes.io/projected/89f3f27a-83eb-4cd9-b557-aeee15998793-kube-api-access-tt2r9\") pod \"node-resolver-7llkw\" (UID: \"89f3f27a-83eb-4cd9-b557-aeee15998793\") " pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.130402 master-0 kubenswrapper[13205]: I0319 09:24:28.130370 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/89f3f27a-83eb-4cd9-b557-aeee15998793-hosts-file\") pod \"node-resolver-7llkw\" (UID: \"89f3f27a-83eb-4cd9-b557-aeee15998793\") " pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.130760 master-0 kubenswrapper[13205]: I0319 09:24:28.130738 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/89f3f27a-83eb-4cd9-b557-aeee15998793-hosts-file\") pod \"node-resolver-7llkw\" (UID: \"89f3f27a-83eb-4cd9-b557-aeee15998793\") " pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.150008 master-0 kubenswrapper[13205]: I0319 09:24:28.149948 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt2r9\" (UniqueName: \"kubernetes.io/projected/89f3f27a-83eb-4cd9-b557-aeee15998793-kube-api-access-tt2r9\") pod \"node-resolver-7llkw\" (UID: \"89f3f27a-83eb-4cd9-b557-aeee15998793\") " pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.167681 master-0 kubenswrapper[13205]: I0319 09:24:28.167630 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"]
Mar 19 09:24:28.168447 master-0 kubenswrapper[13205]: I0319 09:24:28.168408 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.170140 master-0 kubenswrapper[13205]: I0319 09:24:28.170093 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gfzzh"
Mar 19 09:24:28.170242 master-0 kubenswrapper[13205]: I0319 09:24:28.170108 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 09:24:28.170319 master-0 kubenswrapper[13205]: I0319 09:24:28.170292 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 09:24:28.171885 master-0 kubenswrapper[13205]: I0319 09:24:28.171856 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:24:28.231960 master-0 kubenswrapper[13205]: I0319 09:24:28.231914 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fac5afc-b7d8-4cc5-9d18-898ed3125320-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.231960 master-0 kubenswrapper[13205]: I0319 09:24:28.231968 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fac5afc-b7d8-4cc5-9d18-898ed3125320-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.232196 master-0 kubenswrapper[13205]: I0319 09:24:28.232044 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fac5afc-b7d8-4cc5-9d18-898ed3125320-kube-api-access\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.232196 master-0 kubenswrapper[13205]: I0319 09:24:28.232092 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fac5afc-b7d8-4cc5-9d18-898ed3125320-serving-cert\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.232196 master-0 kubenswrapper[13205]: I0319 09:24:28.232109 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fac5afc-b7d8-4cc5-9d18-898ed3125320-service-ca\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.260676 master-0 kubenswrapper[13205]: I0319 09:24:28.260591 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7llkw"
Mar 19 09:24:28.335287 master-0 kubenswrapper[13205]: I0319 09:24:28.334578 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fac5afc-b7d8-4cc5-9d18-898ed3125320-serving-cert\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.335287 master-0 kubenswrapper[13205]: I0319 09:24:28.334663 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fac5afc-b7d8-4cc5-9d18-898ed3125320-service-ca\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.335287 master-0 kubenswrapper[13205]: I0319 09:24:28.334751 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fac5afc-b7d8-4cc5-9d18-898ed3125320-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.335287 master-0 kubenswrapper[13205]: I0319 09:24:28.334790 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fac5afc-b7d8-4cc5-9d18-898ed3125320-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q"
Mar 19 09:24:28.335287 master-0 kubenswrapper[13205]: I0319 09:24:28.334906 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1fac5afc-b7d8-4cc5-9d18-898ed3125320-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" Mar 19 09:24:28.335287 master-0 kubenswrapper[13205]: I0319 09:24:28.335206 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1fac5afc-b7d8-4cc5-9d18-898ed3125320-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" Mar 19 09:24:28.336053 master-0 kubenswrapper[13205]: I0319 09:24:28.335353 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fac5afc-b7d8-4cc5-9d18-898ed3125320-kube-api-access\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" Mar 19 09:24:28.336280 master-0 kubenswrapper[13205]: I0319 09:24:28.336230 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fac5afc-b7d8-4cc5-9d18-898ed3125320-service-ca\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" Mar 19 09:24:28.337681 master-0 kubenswrapper[13205]: I0319 09:24:28.337661 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fac5afc-b7d8-4cc5-9d18-898ed3125320-serving-cert\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " 
pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" Mar 19 09:24:28.486990 master-0 kubenswrapper[13205]: I0319 09:24:28.486891 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1fac5afc-b7d8-4cc5-9d18-898ed3125320-kube-api-access\") pod \"cluster-version-operator-7d58488df-sf92q\" (UID: \"1fac5afc-b7d8-4cc5-9d18-898ed3125320\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" Mar 19 09:24:28.494419 master-0 kubenswrapper[13205]: I0319 09:24:28.494386 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" Mar 19 09:24:28.644446 master-0 kubenswrapper[13205]: W0319 09:24:28.644396 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0f93ac_a77b_488a_bcc4_a45702a9e32d.slice/crio-c222b28cdd33bcddeb7cde7c320a3a3a636e477f92e173bb78823f9c2f912212 WatchSource:0}: Error finding container c222b28cdd33bcddeb7cde7c320a3a3a636e477f92e173bb78823f9c2f912212: Status 404 returned error can't find the container with id c222b28cdd33bcddeb7cde7c320a3a3a636e477f92e173bb78823f9c2f912212 Mar 19 09:24:28.644940 master-0 kubenswrapper[13205]: W0319 09:24:28.644907 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0424dd_a9fc_4763_b30c_884076dd64aa.slice/crio-efdf1c5884cc651d4f128072734d61d88f943d44604340ad56203b663ad048c8 WatchSource:0}: Error finding container efdf1c5884cc651d4f128072734d61d88f943d44604340ad56203b663ad048c8: Status 404 returned error can't find the container with id efdf1c5884cc651d4f128072734d61d88f943d44604340ad56203b663ad048c8 Mar 19 09:24:28.645884 master-0 kubenswrapper[13205]: W0319 09:24:28.645731 13205 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10d0e00_cf19_4067_b7bf_ff569f2f3d71.slice/crio-8d5a3c4d3c299dc743d134196623d4826c224d87b5cbb387408bbb500fb4de71 WatchSource:0}: Error finding container 8d5a3c4d3c299dc743d134196623d4826c224d87b5cbb387408bbb500fb4de71: Status 404 returned error can't find the container with id 8d5a3c4d3c299dc743d134196623d4826c224d87b5cbb387408bbb500fb4de71 Mar 19 09:24:28.857742 master-0 kubenswrapper[13205]: I0319 09:24:28.857657 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b88818-5108-40db-90c8-4f2e7198959e" path="/var/lib/kubelet/pods/51b88818-5108-40db-90c8-4f2e7198959e/volumes" Mar 19 09:24:28.942051 master-0 kubenswrapper[13205]: I0319 09:24:28.941964 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:28.942256 master-0 kubenswrapper[13205]: E0319 09:24:28.942180 13205 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 19 09:24:28.942301 master-0 kubenswrapper[13205]: E0319 09:24:28.942278 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:36.942249825 +0000 UTC m=+62.274556773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : secret "dns-default-metrics-tls" not found Mar 19 09:24:29.347971 master-0 kubenswrapper[13205]: I0319 09:24:29.347921 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvtc\" (UniqueName: \"kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:29.350823 master-0 kubenswrapper[13205]: I0319 09:24:29.350770 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cvtc\" (UniqueName: \"kubernetes.io/projected/745093e5-ffe1-4443-b317-448948f3b311-kube-api-access-5cvtc\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:29.468420 master-0 kubenswrapper[13205]: I0319 09:24:29.468371 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s" event={"ID":"6a0424dd-a9fc-4763-b30c-884076dd64aa","Type":"ContainerStarted","Data":"efdf1c5884cc651d4f128072734d61d88f943d44604340ad56203b663ad048c8"} Mar 19 09:24:29.469496 master-0 kubenswrapper[13205]: I0319 09:24:29.469464 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" event={"ID":"9a0f93ac-a77b-488a-bcc4-a45702a9e32d","Type":"ContainerStarted","Data":"c222b28cdd33bcddeb7cde7c320a3a3a636e477f92e173bb78823f9c2f912212"} Mar 19 09:24:29.470366 master-0 kubenswrapper[13205]: I0319 09:24:29.470312 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" 
event={"ID":"c10d0e00-cf19-4067-b7bf-ff569f2f3d71","Type":"ContainerStarted","Data":"8d5a3c4d3c299dc743d134196623d4826c224d87b5cbb387408bbb500fb4de71"} Mar 19 09:24:31.403855 master-0 kubenswrapper[13205]: I0319 09:24:31.403685 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-64f78496dd-kwdfq"] Mar 19 09:24:31.727004 master-0 kubenswrapper[13205]: I0319 09:24:31.726892 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:24:31.727176 master-0 kubenswrapper[13205]: I0319 09:24:31.727137 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="95378a840215d5780aa88df876aac909" containerName="startup-monitor" containerID="cri-o://6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3" gracePeriod=5 Mar 19 09:24:31.956989 master-0 kubenswrapper[13205]: W0319 09:24:31.956939 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165e3498_b49e_42fa_a614_0680f8c93fc7.slice/crio-195239aec99a51347cc4ba01a93baf5053c1f4e8b0de50811fedcb003dfea0c8 WatchSource:0}: Error finding container 195239aec99a51347cc4ba01a93baf5053c1f4e8b0de50811fedcb003dfea0c8: Status 404 returned error can't find the container with id 195239aec99a51347cc4ba01a93baf5053c1f4e8b0de50811fedcb003dfea0c8 Mar 19 09:24:32.491970 master-0 kubenswrapper[13205]: I0319 09:24:32.491911 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" event={"ID":"1fac5afc-b7d8-4cc5-9d18-898ed3125320","Type":"ContainerStarted","Data":"92e9e4de6887b474e99aef3df196cdd2367b1ac902beaaf8871036c0aae2e996"} Mar 19 09:24:32.493704 master-0 kubenswrapper[13205]: I0319 09:24:32.493644 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq" event={"ID":"165e3498-b49e-42fa-a614-0680f8c93fc7","Type":"ContainerStarted","Data":"195239aec99a51347cc4ba01a93baf5053c1f4e8b0de50811fedcb003dfea0c8"} Mar 19 09:24:33.345871 master-0 kubenswrapper[13205]: I0319 09:24:33.345332 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw"] Mar 19 09:24:34.806732 master-0 kubenswrapper[13205]: I0319 09:24:34.806677 13205 scope.go:117] "RemoveContainer" containerID="d486a2c521f4c2c3eb232b1929f8a1ec255878f2382227f7f128e10063843ecc" Mar 19 09:24:35.204200 master-0 kubenswrapper[13205]: E0319 09:24:35.204105 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:24:36.956817 master-0 kubenswrapper[13205]: I0319 09:24:36.956733 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:36.957884 master-0 kubenswrapper[13205]: E0319 09:24:36.956867 13205 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 19 09:24:36.957884 master-0 kubenswrapper[13205]: E0319 09:24:36.956946 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:24:52.956928818 +0000 UTC m=+78.289235706 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : secret "dns-default-metrics-tls" not found Mar 19 09:24:36.996066 master-0 kubenswrapper[13205]: W0319 09:24:36.995955 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588cf947_93a7_4e1d_b2fe_a281cb4eb44e.slice/crio-9f71fc68a9d592a66c91ebdec06399c18da5168e78a14720fb46026981f532ae WatchSource:0}: Error finding container 9f71fc68a9d592a66c91ebdec06399c18da5168e78a14720fb46026981f532ae: Status 404 returned error can't find the container with id 9f71fc68a9d592a66c91ebdec06399c18da5168e78a14720fb46026981f532ae Mar 19 09:24:37.072470 master-0 kubenswrapper[13205]: W0319 09:24:37.072401 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89f3f27a_83eb_4cd9_b557_aeee15998793.slice/crio-6aca6771dec5a6858fc870db8a0b229423d12909bfc25fde4476982c143e2af5 WatchSource:0}: Error finding container 6aca6771dec5a6858fc870db8a0b229423d12909bfc25fde4476982c143e2af5: Status 404 returned error can't find the container with id 6aca6771dec5a6858fc870db8a0b229423d12909bfc25fde4476982c143e2af5 Mar 19 09:24:37.135835 master-0 kubenswrapper[13205]: I0319 09:24:37.135768 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_95378a840215d5780aa88df876aac909/startup-monitor/0.log" Mar 19 09:24:37.135835 master-0 kubenswrapper[13205]: I0319 09:24:37.135843 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:37.158614 master-0 kubenswrapper[13205]: I0319 09:24:37.158575 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") pod \"95378a840215d5780aa88df876aac909\" (UID: \"95378a840215d5780aa88df876aac909\") " Mar 19 09:24:37.158744 master-0 kubenswrapper[13205]: I0319 09:24:37.158700 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") pod \"95378a840215d5780aa88df876aac909\" (UID: \"95378a840215d5780aa88df876aac909\") " Mar 19 09:24:37.158834 master-0 kubenswrapper[13205]: I0319 09:24:37.158686 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock" (OuterVolumeSpecName: "var-lock") pod "95378a840215d5780aa88df876aac909" (UID: "95378a840215d5780aa88df876aac909"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:37.158834 master-0 kubenswrapper[13205]: I0319 09:24:37.158748 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log" (OuterVolumeSpecName: "var-log") pod "95378a840215d5780aa88df876aac909" (UID: "95378a840215d5780aa88df876aac909"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:37.158926 master-0 kubenswrapper[13205]: I0319 09:24:37.158907 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") pod \"95378a840215d5780aa88df876aac909\" (UID: \"95378a840215d5780aa88df876aac909\") " Mar 19 09:24:37.158962 master-0 kubenswrapper[13205]: I0319 09:24:37.158934 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") pod \"95378a840215d5780aa88df876aac909\" (UID: \"95378a840215d5780aa88df876aac909\") " Mar 19 09:24:37.158993 master-0 kubenswrapper[13205]: I0319 09:24:37.158970 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") pod \"95378a840215d5780aa88df876aac909\" (UID: \"95378a840215d5780aa88df876aac909\") " Mar 19 09:24:37.159089 master-0 kubenswrapper[13205]: I0319 09:24:37.159044 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "95378a840215d5780aa88df876aac909" (UID: "95378a840215d5780aa88df876aac909"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:37.159187 master-0 kubenswrapper[13205]: I0319 09:24:37.159158 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests" (OuterVolumeSpecName: "manifests") pod "95378a840215d5780aa88df876aac909" (UID: "95378a840215d5780aa88df876aac909"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:37.162600 master-0 kubenswrapper[13205]: I0319 09:24:37.159955 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:37.162600 master-0 kubenswrapper[13205]: I0319 09:24:37.159980 13205 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-manifests\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:37.162600 master-0 kubenswrapper[13205]: I0319 09:24:37.159992 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:37.162600 master-0 kubenswrapper[13205]: I0319 09:24:37.160002 13205 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-var-log\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:37.167724 master-0 kubenswrapper[13205]: I0319 09:24:37.166939 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "95378a840215d5780aa88df876aac909" (UID: "95378a840215d5780aa88df876aac909"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:37.261495 master-0 kubenswrapper[13205]: I0319 09:24:37.261381 13205 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/95378a840215d5780aa88df876aac909-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:37.527892 master-0 kubenswrapper[13205]: I0319 09:24:37.527832 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" event={"ID":"c10d0e00-cf19-4067-b7bf-ff569f2f3d71","Type":"ContainerStarted","Data":"fbbd3aa9e5aceaabbdd12723e84b8c4bbbd103e03699120d0ea6b05c25a39df5"} Mar 19 09:24:37.533732 master-0 kubenswrapper[13205]: I0319 09:24:37.533641 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7llkw" event={"ID":"89f3f27a-83eb-4cd9-b557-aeee15998793","Type":"ContainerStarted","Data":"6aca6771dec5a6858fc870db8a0b229423d12909bfc25fde4476982c143e2af5"} Mar 19 09:24:37.536061 master-0 kubenswrapper[13205]: I0319 09:24:37.536044 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_95378a840215d5780aa88df876aac909/startup-monitor/0.log" Mar 19 09:24:37.536140 master-0 kubenswrapper[13205]: I0319 09:24:37.536079 13205 generic.go:334] "Generic (PLEG): container finished" podID="95378a840215d5780aa88df876aac909" containerID="6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3" exitCode=137 Mar 19 09:24:37.536140 master-0 kubenswrapper[13205]: I0319 09:24:37.536129 13205 scope.go:117] "RemoveContainer" containerID="6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3" Mar 19 09:24:37.536227 master-0 kubenswrapper[13205]: I0319 09:24:37.536213 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:37.538462 master-0 kubenswrapper[13205]: I0319 09:24:37.538411 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s" event={"ID":"6a0424dd-a9fc-4763-b30c-884076dd64aa","Type":"ContainerStarted","Data":"e4418506d2c959460708a3270ba8291737f8468b8e6a075cc4be368f4f83bad0"} Mar 19 09:24:37.540681 master-0 kubenswrapper[13205]: I0319 09:24:37.540640 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw" event={"ID":"588cf947-93a7-4e1d-b2fe-a281cb4eb44e","Type":"ContainerStarted","Data":"9f71fc68a9d592a66c91ebdec06399c18da5168e78a14720fb46026981f532ae"} Mar 19 09:24:37.571631 master-0 kubenswrapper[13205]: I0319 09:24:37.571582 13205 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="2b47a5ec-0fe2-49cd-9f9d-6893a5f3b416" Mar 19 09:24:38.752608 master-0 kubenswrapper[13205]: I0319 09:24:38.752500 13205 scope.go:117] "RemoveContainer" containerID="6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3" Mar 19 09:24:38.752923 master-0 kubenswrapper[13205]: E0319 09:24:38.752877 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3\": container with ID starting with 6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3 not found: ID does not exist" containerID="6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3" Mar 19 09:24:38.752959 master-0 kubenswrapper[13205]: I0319 09:24:38.752913 13205 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3"} err="failed to get container status \"6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3\": rpc error: code = NotFound desc = could not find container \"6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3\": container with ID starting with 6515162c88f5a0e7101bd8f4c9ab9f4bbb0fb9d6b63a2db99d70609588290bb3 not found: ID does not exist" Mar 19 09:24:38.859706 master-0 kubenswrapper[13205]: I0319 09:24:38.859658 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95378a840215d5780aa88df876aac909" path="/var/lib/kubelet/pods/95378a840215d5780aa88df876aac909/volumes" Mar 19 09:24:38.859966 master-0 kubenswrapper[13205]: I0319 09:24:38.859941 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 19 09:24:39.324526 master-0 kubenswrapper[13205]: I0319 09:24:39.324183 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:24:39.324526 master-0 kubenswrapper[13205]: I0319 09:24:39.324250 13205 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="2b47a5ec-0fe2-49cd-9f9d-6893a5f3b416" Mar 19 09:24:39.329192 master-0 kubenswrapper[13205]: I0319 09:24:39.328838 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:24:39.329192 master-0 kubenswrapper[13205]: I0319 09:24:39.328882 13205 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="2b47a5ec-0fe2-49cd-9f9d-6893a5f3b416" Mar 19 09:24:40.567620 master-0 kubenswrapper[13205]: I0319 09:24:40.567522 13205 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw" event={"ID":"588cf947-93a7-4e1d-b2fe-a281cb4eb44e","Type":"ContainerStarted","Data":"c10834797ecf23d0fb63a828ab647a6731c0c1feef9db3070a96ba44614bc512"} Mar 19 09:24:40.569754 master-0 kubenswrapper[13205]: I0319 09:24:40.569689 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" event={"ID":"1fac5afc-b7d8-4cc5-9d18-898ed3125320","Type":"ContainerStarted","Data":"bc3cbecc659c46baa7725c5741074b8451483d23c97d6115b60d366d1118fb2b"} Mar 19 09:24:41.577633 master-0 kubenswrapper[13205]: I0319 09:24:41.577584 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" event={"ID":"6904be4c-4f5f-4176-8100-7b6955c6d8da","Type":"ContainerStarted","Data":"7a13e386375bce18c61ee7b6a70f8617363b05476b2f78b3d5995802bea93bef"} Mar 19 09:24:41.579213 master-0 kubenswrapper[13205]: I0319 09:24:41.579176 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" event={"ID":"0e517f77-ec32-4376-b1ff-88ec24a22e3e","Type":"ContainerStarted","Data":"e9bdfaba2815422b6a6ae9f8c8d3ce7c15ddb5bac25cfe6da0c4b53d6e31dc9d"} Mar 19 09:24:41.581085 master-0 kubenswrapper[13205]: I0319 09:24:41.580767 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" event={"ID":"9a0f93ac-a77b-488a-bcc4-a45702a9e32d","Type":"ContainerStarted","Data":"a80231b434755caf6695f3beee3129592e0b9172da3c6519260dc101567b4d3d"} Mar 19 09:24:41.582814 master-0 kubenswrapper[13205]: I0319 09:24:41.582712 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq" 
event={"ID":"165e3498-b49e-42fa-a614-0680f8c93fc7","Type":"ContainerStarted","Data":"3d073454703b98d627688eb3e503094b729bd8ecbb23322adf3bbf3ada8aa266"} Mar 19 09:24:41.584495 master-0 kubenswrapper[13205]: I0319 09:24:41.584404 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" event={"ID":"87b757ff-ca45-4dc7-b31f-ccca53cb2354","Type":"ContainerStarted","Data":"054cdc5e26bc9173015636411bf16495940c7ff09c3cacff260412b26b73df36"} Mar 19 09:24:41.586657 master-0 kubenswrapper[13205]: I0319 09:24:41.586573 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" event={"ID":"e8ca673b-2a2f-4ecf-a142-7fe10fcac707","Type":"ContainerStarted","Data":"6028345572a8ef50a1435ba40a05eb5b44cff27aa19ce4a85fcd0e6d16aac231"} Mar 19 09:24:41.588937 master-0 kubenswrapper[13205]: I0319 09:24:41.588893 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" event={"ID":"4869583f-43af-4ec9-8dea-1da1634816dc","Type":"ContainerStarted","Data":"abbe612c1c855808efb4115b9f8463a26547cb583d0a0640fd7246247f2384b0"} Mar 19 09:24:41.593215 master-0 kubenswrapper[13205]: I0319 09:24:41.593174 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" event={"ID":"ff50023c-0f3f-4506-b26f-9872d0eec45e","Type":"ContainerStarted","Data":"f1753e78da15a3d4a97258ebd31a0fb317f9b62db8bcefd5410f6b22db8ba094"} Mar 19 09:24:41.595327 master-0 kubenswrapper[13205]: I0319 09:24:41.595181 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7llkw" event={"ID":"89f3f27a-83eb-4cd9-b557-aeee15998793","Type":"ContainerStarted","Data":"20a1aea898ac89756bb80ad13f4a22a9161e06ac9fe5e50c77717e00fdd00155"} Mar 19 09:24:41.597517 master-0 kubenswrapper[13205]: I0319 09:24:41.597471 13205 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s" event={"ID":"6a0424dd-a9fc-4763-b30c-884076dd64aa","Type":"ContainerStarted","Data":"b3d11cc1e87ffe57353d61f5deaefe3ae72275d3c18bd432f3d91cf48c13642d"} Mar 19 09:24:41.600054 master-0 kubenswrapper[13205]: I0319 09:24:41.599877 13205 generic.go:334] "Generic (PLEG): container finished" podID="7c70267e-b555-4d56-92e4-f24b65b61283" containerID="05498e0526ba378a7e6fb920e08be99968ac0daa7390ba18866ec2b19daf67a5" exitCode=0 Mar 19 09:24:41.600054 master-0 kubenswrapper[13205]: I0319 09:24:41.599965 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" event={"ID":"7c70267e-b555-4d56-92e4-f24b65b61283","Type":"ContainerDied","Data":"05498e0526ba378a7e6fb920e08be99968ac0daa7390ba18866ec2b19daf67a5"} Mar 19 09:24:41.601924 master-0 kubenswrapper[13205]: I0319 09:24:41.601812 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerStarted","Data":"db787764635a7c7132d16869134e3fbb91501f635bc88c71e3e87ec410b2b532"} Mar 19 09:24:41.604843 master-0 kubenswrapper[13205]: I0319 09:24:41.604551 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" event={"ID":"6775d7ec-8114-4fc3-a23d-d5ac910f3285","Type":"ContainerStarted","Data":"fe29b1bfeb7d6f2cfe52b8699973cde4bbb2e3f1db55a999f72015938d708df8"} Mar 19 09:24:41.604942 master-0 kubenswrapper[13205]: I0319 09:24:41.604839 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw" Mar 19 09:24:41.922056 master-0 kubenswrapper[13205]: I0319 09:24:41.920186 13205 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7d58488df-sf92q" podStartSLOduration=13.920166368 podStartE2EDuration="13.920166368s" podCreationTimestamp="2026-03-19 09:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:24:41.917296749 +0000 UTC m=+67.249603637" watchObservedRunningTime="2026-03-19 09:24:41.920166368 +0000 UTC m=+67.252473276" Mar 19 09:24:42.028125 master-0 kubenswrapper[13205]: I0319 09:24:42.027391 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-zgn8x" podStartSLOduration=7.295230504 podStartE2EDuration="21.027375264s" podCreationTimestamp="2026-03-19 09:24:21 +0000 UTC" firstStartedPulling="2026-03-19 09:24:25.020365532 +0000 UTC m=+50.352672420" lastFinishedPulling="2026-03-19 09:24:38.752510292 +0000 UTC m=+64.084817180" observedRunningTime="2026-03-19 09:24:42.026473801 +0000 UTC m=+67.358780709" watchObservedRunningTime="2026-03-19 09:24:42.027375264 +0000 UTC m=+67.359682152" Mar 19 09:24:42.215864 master-0 kubenswrapper[13205]: I0319 09:24:42.215743 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw" Mar 19 09:24:42.219685 master-0 kubenswrapper[13205]: I0319 09:24:42.219144 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-6f5bddd45b-hzcnw" podStartSLOduration=16.219129095 podStartE2EDuration="16.219129095s" podCreationTimestamp="2026-03-19 09:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:24:42.177924357 +0000 UTC m=+67.510231245" watchObservedRunningTime="2026-03-19 09:24:42.219129095 +0000 UTC m=+67.551435983" Mar 19 
09:24:42.220332 master-0 kubenswrapper[13205]: I0319 09:24:42.220288 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" podStartSLOduration=7.35242721 podStartE2EDuration="19.220279703s" podCreationTimestamp="2026-03-19 09:24:23 +0000 UTC" firstStartedPulling="2026-03-19 09:24:26.884771092 +0000 UTC m=+52.217077980" lastFinishedPulling="2026-03-19 09:24:38.752623585 +0000 UTC m=+64.084930473" observedRunningTime="2026-03-19 09:24:42.219121735 +0000 UTC m=+67.551428623" watchObservedRunningTime="2026-03-19 09:24:42.220279703 +0000 UTC m=+67.552586591" Mar 19 09:24:42.607832 master-0 kubenswrapper[13205]: I0319 09:24:42.607658 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" podStartSLOduration=7.946234794 podStartE2EDuration="19.607637569s" podCreationTimestamp="2026-03-19 09:24:23 +0000 UTC" firstStartedPulling="2026-03-19 09:24:28.647207934 +0000 UTC m=+53.979514822" lastFinishedPulling="2026-03-19 09:24:40.308610709 +0000 UTC m=+65.640917597" observedRunningTime="2026-03-19 09:24:42.607578228 +0000 UTC m=+67.939885136" watchObservedRunningTime="2026-03-19 09:24:42.607637569 +0000 UTC m=+67.939944457" Mar 19 09:24:42.623598 master-0 kubenswrapper[13205]: I0319 09:24:42.623527 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" event={"ID":"87b757ff-ca45-4dc7-b31f-ccca53cb2354","Type":"ContainerStarted","Data":"46c4ff53cd5ebfcee57e9a6606cf66a143c03b3841436ec9e8f675f9c3ddbef2"} Mar 19 09:24:42.630577 master-0 kubenswrapper[13205]: I0319 09:24:42.630514 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" 
event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerStarted","Data":"041e723992540922436e90b5d9095bce07f4956cd2e2523edb1bcdbcbea31c25"} Mar 19 09:24:42.646075 master-0 kubenswrapper[13205]: I0319 09:24:42.646036 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" event={"ID":"0e517f77-ec32-4376-b1ff-88ec24a22e3e","Type":"ContainerStarted","Data":"758dac38fff0f5a17246c5019651d59f791dfb560b591e2a9a819daf8b380f64"} Mar 19 09:24:42.655803 master-0 kubenswrapper[13205]: I0319 09:24:42.655734 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq" event={"ID":"165e3498-b49e-42fa-a614-0680f8c93fc7","Type":"ContainerStarted","Data":"b71b83d41575edf98052e4b078a3795d73b01b50c27475f2b3aaf5a4394c311b"} Mar 19 09:24:42.778594 master-0 kubenswrapper[13205]: I0319 09:24:42.774048 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-lvh8s" podStartSLOduration=18.774029487 podStartE2EDuration="18.774029487s" podCreationTimestamp="2026-03-19 09:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:24:42.730756719 +0000 UTC m=+68.063063607" watchObservedRunningTime="2026-03-19 09:24:42.774029487 +0000 UTC m=+68.106336375" Mar 19 09:24:42.783642 master-0 kubenswrapper[13205]: I0319 09:24:42.776511 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-n7hxq" podStartSLOduration=14.163384914 podStartE2EDuration="21.776503327s" podCreationTimestamp="2026-03-19 09:24:21 +0000 UTC" firstStartedPulling="2026-03-19 09:24:24.344610075 +0000 UTC m=+49.676916953" lastFinishedPulling="2026-03-19 09:24:31.957728478 +0000 UTC m=+57.290035366" 
observedRunningTime="2026-03-19 09:24:42.770976323 +0000 UTC m=+68.103283211" watchObservedRunningTime="2026-03-19 09:24:42.776503327 +0000 UTC m=+68.108810215" Mar 19 09:24:42.783642 master-0 kubenswrapper[13205]: I0319 09:24:42.778548 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-nbj7j"] Mar 19 09:24:42.783642 master-0 kubenswrapper[13205]: E0319 09:24:42.779351 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95378a840215d5780aa88df876aac909" containerName="startup-monitor" Mar 19 09:24:42.783642 master-0 kubenswrapper[13205]: I0319 09:24:42.779368 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="95378a840215d5780aa88df876aac909" containerName="startup-monitor" Mar 19 09:24:42.783642 master-0 kubenswrapper[13205]: I0319 09:24:42.779520 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="95378a840215d5780aa88df876aac909" containerName="startup-monitor" Mar 19 09:24:42.804899 master-0 kubenswrapper[13205]: I0319 09:24:42.799758 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:42.809600 master-0 kubenswrapper[13205]: I0319 09:24:42.806106 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 09:24:42.934131 master-0 kubenswrapper[13205]: I0319 09:24:42.933145 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7llkw" podStartSLOduration=15.933088437 podStartE2EDuration="15.933088437s" podCreationTimestamp="2026-03-19 09:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:24:42.926861776 +0000 UTC m=+68.259168664" watchObservedRunningTime="2026-03-19 09:24:42.933088437 +0000 UTC m=+68.265395325" Mar 19 09:24:42.947064 master-0 kubenswrapper[13205]: I0319 09:24:42.947011 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-mcd-auth-proxy-config\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:42.947225 master-0 kubenswrapper[13205]: I0319 09:24:42.947098 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-proxy-tls\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:42.947225 master-0 kubenswrapper[13205]: I0319 09:24:42.947129 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-rootfs\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:42.947225 master-0 kubenswrapper[13205]: I0319 09:24:42.947146 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26ft\" (UniqueName: \"kubernetes.io/projected/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-kube-api-access-s26ft\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.044050 master-0 kubenswrapper[13205]: I0319 09:24:43.043933 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-68bf6ff9d6-4qq6m" podStartSLOduration=8.085590871 podStartE2EDuration="21.04391157s" podCreationTimestamp="2026-03-19 09:24:22 +0000 UTC" firstStartedPulling="2026-03-19 09:24:25.785336489 +0000 UTC m=+51.117643377" lastFinishedPulling="2026-03-19 09:24:38.743657188 +0000 UTC m=+64.075964076" observedRunningTime="2026-03-19 09:24:43.041142553 +0000 UTC m=+68.373449451" watchObservedRunningTime="2026-03-19 09:24:43.04391157 +0000 UTC m=+68.376218458" Mar 19 09:24:43.049992 master-0 kubenswrapper[13205]: I0319 09:24:43.049953 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-proxy-tls\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.050109 master-0 kubenswrapper[13205]: I0319 09:24:43.050021 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-rootfs\") 
pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.050109 master-0 kubenswrapper[13205]: I0319 09:24:43.050052 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26ft\" (UniqueName: \"kubernetes.io/projected/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-kube-api-access-s26ft\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.050109 master-0 kubenswrapper[13205]: I0319 09:24:43.050103 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-mcd-auth-proxy-config\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.050212 master-0 kubenswrapper[13205]: I0319 09:24:43.050108 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-rootfs\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.051174 master-0 kubenswrapper[13205]: I0319 09:24:43.050805 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-mcd-auth-proxy-config\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.064649 master-0 kubenswrapper[13205]: I0319 09:24:43.054841 13205 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-proxy-tls\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.189972 master-0 kubenswrapper[13205]: I0319 09:24:43.178627 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26ft\" (UniqueName: \"kubernetes.io/projected/fd1bd769-25e2-4b9c-a112-28e2c624f9b7-kube-api-access-s26ft\") pod \"machine-config-daemon-nbj7j\" (UID: \"fd1bd769-25e2-4b9c-a112-28e2c624f9b7\") " pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.427993 master-0 kubenswrapper[13205]: I0319 09:24:43.426151 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" Mar 19 09:24:43.429645 master-0 kubenswrapper[13205]: I0319 09:24:43.429599 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" podStartSLOduration=6.711332531 podStartE2EDuration="20.429589585s" podCreationTimestamp="2026-03-19 09:24:23 +0000 UTC" firstStartedPulling="2026-03-19 09:24:26.422046511 +0000 UTC m=+51.754353389" lastFinishedPulling="2026-03-19 09:24:40.140303555 +0000 UTC m=+65.472610443" observedRunningTime="2026-03-19 09:24:43.427913755 +0000 UTC m=+68.760220643" watchObservedRunningTime="2026-03-19 09:24:43.429589585 +0000 UTC m=+68.761896473" Mar 19 09:24:43.668872 master-0 kubenswrapper[13205]: I0319 09:24:43.668009 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" event={"ID":"fd1bd769-25e2-4b9c-a112-28e2c624f9b7","Type":"ContainerStarted","Data":"8d0ca3f4a6299e57b7735d5a697824e77e5e99eccc3d0b449a37138c0f716071"} Mar 19 09:24:43.668872 master-0 kubenswrapper[13205]: I0319 
09:24:43.668060 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" event={"ID":"fd1bd769-25e2-4b9c-a112-28e2c624f9b7","Type":"ContainerStarted","Data":"3b5f85fec0e9c06246e27fce0276318b8a1283159e1afef24570d665b9fe5c4a"} Mar 19 09:24:43.677606 master-0 kubenswrapper[13205]: I0319 09:24:43.676376 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerStarted","Data":"3c4cf99c0bc54cc07a19d2958f1301339fb7df219a6cda7d07143443f33f3b4a"} Mar 19 09:24:43.798069 master-0 kubenswrapper[13205]: I0319 09:24:43.797387 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-64f78496dd-kwdfq" podStartSLOduration=18.797373118 podStartE2EDuration="18.797373118s" podCreationTimestamp="2026-03-19 09:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:24:43.646909796 +0000 UTC m=+68.979216684" watchObservedRunningTime="2026-03-19 09:24:43.797373118 +0000 UTC m=+69.129680006" Mar 19 09:24:43.837226 master-0 kubenswrapper[13205]: I0319 09:24:43.837112 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"] Mar 19 09:24:43.837409 master-0 kubenswrapper[13205]: I0319 09:24:43.837330 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="multus-admission-controller" containerID="cri-o://ec2a8f37c4a4bf290761beb86b8148cabc7c9a7b8241accf763dd14e9ad11acc" gracePeriod=30 Mar 19 09:24:43.837578 master-0 kubenswrapper[13205]: I0319 09:24:43.837478 13205 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="kube-rbac-proxy" containerID="cri-o://5a1232e74d2b81fa0fb089837e46ec811c58ea20165c36d4de9800956bf481df" gracePeriod=30 Mar 19 09:24:43.916194 master-0 kubenswrapper[13205]: I0319 09:24:43.914818 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" podStartSLOduration=13.507257692 podStartE2EDuration="22.914802441s" podCreationTimestamp="2026-03-19 09:24:21 +0000 UTC" firstStartedPulling="2026-03-19 09:24:23.939048118 +0000 UTC m=+49.271355016" lastFinishedPulling="2026-03-19 09:24:33.346592837 +0000 UTC m=+58.678899765" observedRunningTime="2026-03-19 09:24:43.911332366 +0000 UTC m=+69.243639244" watchObservedRunningTime="2026-03-19 09:24:43.914802441 +0000 UTC m=+69.247109329" Mar 19 09:24:44.169686 master-0 kubenswrapper[13205]: I0319 09:24:44.169610 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-tkj2j" podStartSLOduration=7.794735455 podStartE2EDuration="23.169594668s" podCreationTimestamp="2026-03-19 09:24:21 +0000 UTC" firstStartedPulling="2026-03-19 09:24:24.894432514 +0000 UTC m=+50.226739402" lastFinishedPulling="2026-03-19 09:24:40.269291727 +0000 UTC m=+65.601598615" observedRunningTime="2026-03-19 09:24:44.169107656 +0000 UTC m=+69.501414554" watchObservedRunningTime="2026-03-19 09:24:44.169594668 +0000 UTC m=+69.501901556" Mar 19 09:24:44.368883 master-0 kubenswrapper[13205]: I0319 09:24:44.368804 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" podStartSLOduration=15.114635248999999 podStartE2EDuration="23.36878672s" 
podCreationTimestamp="2026-03-19 09:24:21 +0000 UTC" firstStartedPulling="2026-03-19 09:24:23.241122203 +0000 UTC m=+48.573429091" lastFinishedPulling="2026-03-19 09:24:31.495273674 +0000 UTC m=+56.827580562" observedRunningTime="2026-03-19 09:24:44.367200921 +0000 UTC m=+69.699507829" watchObservedRunningTime="2026-03-19 09:24:44.36878672 +0000 UTC m=+69.701093608" Mar 19 09:24:44.683760 master-0 kubenswrapper[13205]: I0319 09:24:44.683697 13205 generic.go:334] "Generic (PLEG): container finished" podID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerID="5a1232e74d2b81fa0fb089837e46ec811c58ea20165c36d4de9800956bf481df" exitCode=0 Mar 19 09:24:44.684270 master-0 kubenswrapper[13205]: I0319 09:24:44.683783 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" event={"ID":"8beda3a0-a653-4810-b3f2-d25badb21ab1","Type":"ContainerDied","Data":"5a1232e74d2b81fa0fb089837e46ec811c58ea20165c36d4de9800956bf481df"} Mar 19 09:24:44.686336 master-0 kubenswrapper[13205]: I0319 09:24:44.686302 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" event={"ID":"fd1bd769-25e2-4b9c-a112-28e2c624f9b7","Type":"ContainerStarted","Data":"b63698929f0bd191cb880464806df80750bd48efd40bb0f12e79cf0a685d4e46"} Mar 19 09:24:45.027749 master-0 kubenswrapper[13205]: I0319 09:24:45.024283 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-nbj7j" podStartSLOduration=3.024246097 podStartE2EDuration="3.024246097s" podCreationTimestamp="2026-03-19 09:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:24:45.020644948 +0000 UTC m=+70.352951876" watchObservedRunningTime="2026-03-19 09:24:45.024246097 +0000 UTC m=+70.356553005" Mar 19 09:24:46.502840 master-0 kubenswrapper[13205]: 
I0319 09:24:46.502778 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-cd9c5"] Mar 19 09:24:46.503518 master-0 kubenswrapper[13205]: I0319 09:24:46.503499 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.505295 master-0 kubenswrapper[13205]: I0319 09:24:46.505253 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-qn2w4" Mar 19 09:24:46.505499 master-0 kubenswrapper[13205]: I0319 09:24:46.505470 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 09:24:46.646553 master-0 kubenswrapper[13205]: I0319 09:24:46.646478 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcftx\" (UniqueName: \"kubernetes.io/projected/9e6648a1-bdd4-4c53-921b-790e8308d8e3-kube-api-access-vcftx\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.646770 master-0 kubenswrapper[13205]: I0319 09:24:46.646578 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e6648a1-bdd4-4c53-921b-790e8308d8e3-serviceca\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.646770 master-0 kubenswrapper[13205]: I0319 09:24:46.646603 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e6648a1-bdd4-4c53-921b-790e8308d8e3-host\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.747956 master-0 kubenswrapper[13205]: I0319 
09:24:46.747907 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e6648a1-bdd4-4c53-921b-790e8308d8e3-host\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.748117 master-0 kubenswrapper[13205]: I0319 09:24:46.747991 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcftx\" (UniqueName: \"kubernetes.io/projected/9e6648a1-bdd4-4c53-921b-790e8308d8e3-kube-api-access-vcftx\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.748117 master-0 kubenswrapper[13205]: I0319 09:24:46.748046 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e6648a1-bdd4-4c53-921b-790e8308d8e3-serviceca\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.748680 master-0 kubenswrapper[13205]: I0319 09:24:46.748656 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e6648a1-bdd4-4c53-921b-790e8308d8e3-serviceca\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:46.748756 master-0 kubenswrapper[13205]: I0319 09:24:46.748721 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e6648a1-bdd4-4c53-921b-790e8308d8e3-host\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:47.027370 master-0 kubenswrapper[13205]: I0319 09:24:47.027331 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcftx\" 
(UniqueName: \"kubernetes.io/projected/9e6648a1-bdd4-4c53-921b-790e8308d8e3-kube-api-access-vcftx\") pod \"node-ca-cd9c5\" (UID: \"9e6648a1-bdd4-4c53-921b-790e8308d8e3\") " pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:47.173799 master-0 kubenswrapper[13205]: I0319 09:24:47.173736 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cd9c5" Mar 19 09:24:53.053503 master-0 kubenswrapper[13205]: I0319 09:24:53.053441 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:24:53.054252 master-0 kubenswrapper[13205]: E0319 09:24:53.053671 13205 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 19 09:24:53.054252 master-0 kubenswrapper[13205]: E0319 09:24:53.053749 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls podName:745093e5-ffe1-4443-b317-448948f3b311 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:25.053730304 +0000 UTC m=+110.386037202 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls") pod "dns-default-79jrh" (UID: "745093e5-ffe1-4443-b317-448948f3b311") : secret "dns-default-metrics-tls" not found Mar 19 09:24:55.670620 master-0 kubenswrapper[13205]: W0319 09:24:55.670565 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6648a1_bdd4_4c53_921b_790e8308d8e3.slice/crio-9a5c55e1226c7a13023b8bc86a21a2ebdaa9fb63e8cd0dcd3413ab281448812a WatchSource:0}: Error finding container 9a5c55e1226c7a13023b8bc86a21a2ebdaa9fb63e8cd0dcd3413ab281448812a: Status 404 returned error can't find the container with id 9a5c55e1226c7a13023b8bc86a21a2ebdaa9fb63e8cd0dcd3413ab281448812a Mar 19 09:24:55.760657 master-0 kubenswrapper[13205]: I0319 09:24:55.760584 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cd9c5" event={"ID":"9e6648a1-bdd4-4c53-921b-790e8308d8e3","Type":"ContainerStarted","Data":"9a5c55e1226c7a13023b8bc86a21a2ebdaa9fb63e8cd0dcd3413ab281448812a"} Mar 19 09:24:57.773163 master-0 kubenswrapper[13205]: I0319 09:24:57.773110 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" event={"ID":"7c70267e-b555-4d56-92e4-f24b65b61283","Type":"ContainerStarted","Data":"f7a209e9add470bf3e45d1a7bc2114234ee8ecd9f4af52ecea5fb83e9a04b340"} Mar 19 09:24:58.779768 master-0 kubenswrapper[13205]: I0319 09:24:58.779684 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:25:00.167834 master-0 kubenswrapper[13205]: I0319 09:25:00.167750 13205 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-rfnfj container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" start-of-body= Mar 19 09:25:00.167834 master-0 kubenswrapper[13205]: I0319 09:25:00.167817 13205 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-rfnfj container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" start-of-body= Mar 19 09:25:00.167834 master-0 kubenswrapper[13205]: I0319 09:25:00.167820 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" podUID="7c70267e-b555-4d56-92e4-f24b65b61283" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" Mar 19 09:25:00.168876 master-0 kubenswrapper[13205]: I0319 09:25:00.167874 13205 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" podUID="7c70267e-b555-4d56-92e4-f24b65b61283" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" Mar 19 09:25:03.166198 master-0 kubenswrapper[13205]: I0319 09:25:03.166086 13205 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-rfnfj container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" start-of-body= Mar 19 09:25:03.166198 master-0 kubenswrapper[13205]: I0319 09:25:03.166118 13205 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-rfnfj container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" start-of-body= Mar 19 09:25:03.166198 master-0 kubenswrapper[13205]: I0319 09:25:03.166153 13205 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" podUID="7c70267e-b555-4d56-92e4-f24b65b61283" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" Mar 19 09:25:03.167180 master-0 kubenswrapper[13205]: I0319 09:25:03.166211 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" podUID="7c70267e-b555-4d56-92e4-f24b65b61283" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.60:8443/healthz\": dial tcp 10.128.0.60:8443: connect: connection refused" Mar 19 09:25:06.179610 master-0 kubenswrapper[13205]: I0319 09:25:06.179347 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" Mar 19 09:25:06.268517 master-0 kubenswrapper[13205]: I0319 09:25:06.268446 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-rfnfj" podStartSLOduration=14.388025786 podStartE2EDuration="44.268430019s" podCreationTimestamp="2026-03-19 09:24:22 +0000 UTC" firstStartedPulling="2026-03-19 09:24:25.786571628 +0000 UTC m=+51.118878526" lastFinishedPulling="2026-03-19 09:24:55.666975831 +0000 UTC m=+80.999282759" observedRunningTime="2026-03-19 09:25:04.589902227 +0000 UTC m=+89.922209115" watchObservedRunningTime="2026-03-19 09:25:06.268430019 +0000 UTC m=+91.600736907" Mar 19 09:25:06.845071 master-0 kubenswrapper[13205]: I0319 09:25:06.845020 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" event={"ID":"c10d0e00-cf19-4067-b7bf-ff569f2f3d71","Type":"ContainerStarted","Data":"4c90f32d1065ba9ed17bf8788db8b1b14afa5ce6b93baff30fbc9e338ea1094e"} Mar 19 09:25:07.381273 master-0 kubenswrapper[13205]: I0319 09:25:07.381120 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b"] Mar 19 09:25:07.382379 master-0 kubenswrapper[13205]: I0319 09:25:07.382347 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.398619 master-0 kubenswrapper[13205]: I0319 09:25:07.397854 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 09:25:07.399095 master-0 kubenswrapper[13205]: I0319 09:25:07.399027 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-bxhvs" Mar 19 09:25:07.428300 master-0 kubenswrapper[13205]: I0319 09:25:07.428212 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-nmvcv" podStartSLOduration=15.297401651 podStartE2EDuration="44.428187663s" podCreationTimestamp="2026-03-19 09:24:23 +0000 UTC" firstStartedPulling="2026-03-19 09:24:37.203721033 +0000 UTC m=+62.536027921" lastFinishedPulling="2026-03-19 09:25:06.334507045 +0000 UTC m=+91.666813933" observedRunningTime="2026-03-19 09:25:07.424783367 +0000 UTC m=+92.757090285" watchObservedRunningTime="2026-03-19 09:25:07.428187663 +0000 UTC m=+92.760494561" Mar 19 09:25:07.432469 master-0 kubenswrapper[13205]: I0319 09:25:07.432407 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b"] Mar 19 09:25:07.578032 master-0 
kubenswrapper[13205]: I0319 09:25:07.577934 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77hsf\" (UniqueName: \"kubernetes.io/projected/6f17d1d6-e68a-458d-993b-e6a442281a9c-kube-api-access-77hsf\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.578032 master-0 kubenswrapper[13205]: I0319 09:25:07.578009 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f17d1d6-e68a-458d-993b-e6a442281a9c-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.578263 master-0 kubenswrapper[13205]: I0319 09:25:07.578072 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f17d1d6-e68a-458d-993b-e6a442281a9c-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.679477 master-0 kubenswrapper[13205]: I0319 09:25:07.679373 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f17d1d6-e68a-458d-993b-e6a442281a9c-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.679739 master-0 kubenswrapper[13205]: I0319 09:25:07.679721 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f17d1d6-e68a-458d-993b-e6a442281a9c-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.679909 master-0 kubenswrapper[13205]: I0319 09:25:07.679895 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77hsf\" (UniqueName: \"kubernetes.io/projected/6f17d1d6-e68a-458d-993b-e6a442281a9c-kube-api-access-77hsf\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.684229 master-0 kubenswrapper[13205]: I0319 09:25:07.684200 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f17d1d6-e68a-458d-993b-e6a442281a9c-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.695740 master-0 kubenswrapper[13205]: I0319 09:25:07.695660 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f17d1d6-e68a-458d-993b-e6a442281a9c-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: \"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.724164 master-0 kubenswrapper[13205]: I0319 09:25:07.724135 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77hsf\" (UniqueName: \"kubernetes.io/projected/6f17d1d6-e68a-458d-993b-e6a442281a9c-kube-api-access-77hsf\") pod \"machine-config-controller-b4f87c5b9-5ft4b\" (UID: 
\"6f17d1d6-e68a-458d-993b-e6a442281a9c\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:07.852088 master-0 kubenswrapper[13205]: I0319 09:25:07.852029 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cd9c5" event={"ID":"9e6648a1-bdd4-4c53-921b-790e8308d8e3","Type":"ContainerStarted","Data":"29283da2f664f1588716a9a06de56f7e52a846371699ee69f7468e2b7f8be9d1"} Mar 19 09:25:08.003290 master-0 kubenswrapper[13205]: I0319 09:25:08.003104 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" Mar 19 09:25:08.104102 master-0 kubenswrapper[13205]: I0319 09:25:08.099791 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cd9c5" podStartSLOduration=11.128869943 podStartE2EDuration="22.099773899s" podCreationTimestamp="2026-03-19 09:24:46 +0000 UTC" firstStartedPulling="2026-03-19 09:24:55.672515338 +0000 UTC m=+81.004822226" lastFinishedPulling="2026-03-19 09:25:06.643419294 +0000 UTC m=+91.975726182" observedRunningTime="2026-03-19 09:25:08.099162064 +0000 UTC m=+93.431468952" watchObservedRunningTime="2026-03-19 09:25:08.099773899 +0000 UTC m=+93.432080807" Mar 19 09:25:08.628545 master-0 kubenswrapper[13205]: I0319 09:25:08.627419 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b"] Mar 19 09:25:08.859450 master-0 kubenswrapper[13205]: I0319 09:25:08.859394 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" event={"ID":"6f17d1d6-e68a-458d-993b-e6a442281a9c","Type":"ContainerStarted","Data":"18c44947a2b0eae352b0de40c6a23534d5d8c260606559955b7f5484f7828d3a"} Mar 19 09:25:08.859450 master-0 kubenswrapper[13205]: I0319 09:25:08.859444 13205 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" event={"ID":"6f17d1d6-e68a-458d-993b-e6a442281a9c","Type":"ContainerStarted","Data":"72b8d729964d85c812b2864fb3375049f29ac9148f07ea9d78705baf5c4dc791"} Mar 19 09:25:09.579948 master-0 kubenswrapper[13205]: I0319 09:25:09.579900 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 09:25:09.580627 master-0 kubenswrapper[13205]: I0319 09:25:09.580601 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.583015 master-0 kubenswrapper[13205]: I0319 09:25:09.582972 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-qzwhq" Mar 19 09:25:09.587363 master-0 kubenswrapper[13205]: I0319 09:25:09.587313 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 09:25:09.645957 master-0 kubenswrapper[13205]: I0319 09:25:09.644697 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n"] Mar 19 09:25:09.645957 master-0 kubenswrapper[13205]: I0319 09:25:09.645740 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" Mar 19 09:25:09.647122 master-0 kubenswrapper[13205]: I0319 09:25:09.647079 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dcf5569b5-29cq2"] Mar 19 09:25:09.647382 master-0 kubenswrapper[13205]: I0319 09:25:09.647345 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 09:25:09.647734 master-0 kubenswrapper[13205]: I0319 09:25:09.647701 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qttf4" Mar 19 09:25:09.647912 master-0 kubenswrapper[13205]: I0319 09:25:09.647875 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.649709 master-0 kubenswrapper[13205]: I0319 09:25:09.649657 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 09:25:09.649709 master-0 kubenswrapper[13205]: I0319 09:25:09.649664 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-r2bsk" Mar 19 09:25:09.649935 master-0 kubenswrapper[13205]: I0319 09:25:09.649884 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24"] Mar 19 09:25:09.650052 master-0 kubenswrapper[13205]: I0319 09:25:09.649914 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 09:25:09.650103 master-0 kubenswrapper[13205]: I0319 09:25:09.650062 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 09:25:09.650142 master-0 kubenswrapper[13205]: I0319 09:25:09.650109 13205 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 09:25:09.650142 master-0 kubenswrapper[13205]: I0319 09:25:09.650031 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 09:25:09.650250 master-0 kubenswrapper[13205]: I0319 09:25:09.650208 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 09:25:09.650938 master-0 kubenswrapper[13205]: I0319 09:25:09.650908 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" Mar 19 09:25:09.706688 master-0 kubenswrapper[13205]: I0319 09:25:09.706638 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155018d1-af14-4adc-b7a0-cab0133dd65f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.706860 master-0 kubenswrapper[13205]: I0319 09:25:09.706699 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-var-lock\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.706860 master-0 kubenswrapper[13205]: I0319 09:25:09.706742 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.819449 master-0 kubenswrapper[13205]: 
I0319 09:25:09.819368 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-metrics-certs\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.819785 master-0 kubenswrapper[13205]: I0319 09:25:09.819460 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-default-certificate\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.819785 master-0 kubenswrapper[13205]: I0319 09:25:09.819502 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-var-lock\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.819785 master-0 kubenswrapper[13205]: I0319 09:25:09.819548 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5bgk\" (UniqueName: \"kubernetes.io/projected/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-kube-api-access-z5bgk\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.819785 master-0 kubenswrapper[13205]: I0319 09:25:09.819607 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-var-lock\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " 
pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.819785 master-0 kubenswrapper[13205]: I0319 09:25:09.819650 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-service-ca-bundle\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.820152 master-0 kubenswrapper[13205]: I0319 09:25:09.819823 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.820152 master-0 kubenswrapper[13205]: I0319 09:25:09.819866 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.820152 master-0 kubenswrapper[13205]: I0319 09:25:09.819968 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsl2r\" (UniqueName: \"kubernetes.io/projected/e71d0aae-5e50-45f9-a0f2-8f3c3357976a-kube-api-access-bsl2r\") pod \"network-check-source-b4bf74f6-wwd24\" (UID: \"e71d0aae-5e50-45f9-a0f2-8f3c3357976a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" Mar 19 09:25:09.820152 master-0 kubenswrapper[13205]: I0319 09:25:09.820037 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/155018d1-af14-4adc-b7a0-cab0133dd65f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:09.820152 master-0 kubenswrapper[13205]: I0319 09:25:09.820099 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b0bc69d1-1383-478f-9a9e-c23e88646056-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-7wt9n\" (UID: \"b0bc69d1-1383-478f-9a9e-c23e88646056\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" Mar 19 09:25:09.820152 master-0 kubenswrapper[13205]: I0319 09:25:09.820126 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-stats-auth\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.821959 master-0 kubenswrapper[13205]: I0319 09:25:09.821891 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 09:25:09.824126 master-0 kubenswrapper[13205]: I0319 09:25:09.824043 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n"] Mar 19 09:25:09.826171 master-0 kubenswrapper[13205]: I0319 09:25:09.826093 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24"] Mar 19 09:25:09.866561 master-0 kubenswrapper[13205]: I0319 09:25:09.866419 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" 
event={"ID":"6f17d1d6-e68a-458d-993b-e6a442281a9c","Type":"ContainerStarted","Data":"83cb446ad522a507b9a7a866476e32e5c318eb6e06d075e26107b897d0b4ff06"} Mar 19 09:25:09.920887 master-0 kubenswrapper[13205]: I0319 09:25:09.920801 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b0bc69d1-1383-478f-9a9e-c23e88646056-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-7wt9n\" (UID: \"b0bc69d1-1383-478f-9a9e-c23e88646056\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" Mar 19 09:25:09.920887 master-0 kubenswrapper[13205]: I0319 09:25:09.920864 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-stats-auth\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.920887 master-0 kubenswrapper[13205]: I0319 09:25:09.920892 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-metrics-certs\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.921165 master-0 kubenswrapper[13205]: I0319 09:25:09.920911 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-default-certificate\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.921211 master-0 kubenswrapper[13205]: I0319 09:25:09.921172 13205 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-z5bgk\" (UniqueName: \"kubernetes.io/projected/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-kube-api-access-z5bgk\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.921328 master-0 kubenswrapper[13205]: I0319 09:25:09.921283 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-service-ca-bundle\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.921902 master-0 kubenswrapper[13205]: I0319 09:25:09.921510 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl2r\" (UniqueName: \"kubernetes.io/projected/e71d0aae-5e50-45f9-a0f2-8f3c3357976a-kube-api-access-bsl2r\") pod \"network-check-source-b4bf74f6-wwd24\" (UID: \"e71d0aae-5e50-45f9-a0f2-8f3c3357976a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" Mar 19 09:25:09.922777 master-0 kubenswrapper[13205]: I0319 09:25:09.922716 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-service-ca-bundle\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.924649 master-0 kubenswrapper[13205]: I0319 09:25:09.924601 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-stats-auth\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 
09:25:09.924925 master-0 kubenswrapper[13205]: I0319 09:25:09.924880 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b0bc69d1-1383-478f-9a9e-c23e88646056-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-7wt9n\" (UID: \"b0bc69d1-1383-478f-9a9e-c23e88646056\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" Mar 19 09:25:09.925234 master-0 kubenswrapper[13205]: I0319 09:25:09.925207 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-default-certificate\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.925275 master-0 kubenswrapper[13205]: I0319 09:25:09.925216 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-metrics-certs\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:09.969595 master-0 kubenswrapper[13205]: I0319 09:25:09.969518 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" Mar 19 09:25:10.183211 master-0 kubenswrapper[13205]: I0319 09:25:10.183077 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bf4j2"] Mar 19 09:25:10.183899 master-0 kubenswrapper[13205]: I0319 09:25:10.183854 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-k4dfd_e7fae040-28fa-4d97-8482-fd0dd12cc921/authentication-operator/0.log" Mar 19 09:25:10.184072 master-0 kubenswrapper[13205]: I0319 09:25:10.184034 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.185844 master-0 kubenswrapper[13205]: I0319 09:25:10.185808 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-p5dd4" Mar 19 09:25:10.186005 master-0 kubenswrapper[13205]: I0319 09:25:10.185947 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 09:25:10.186462 master-0 kubenswrapper[13205]: I0319 09:25:10.186432 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 09:25:10.186514 master-0 kubenswrapper[13205]: I0319 09:25:10.186440 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 09:25:10.189050 master-0 kubenswrapper[13205]: I0319 09:25:10.189012 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155018d1-af14-4adc-b7a0-cab0133dd65f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:10.206581 master-0 
kubenswrapper[13205]: I0319 09:25:10.206498 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:10.327194 master-0 kubenswrapper[13205]: I0319 09:25:10.327140 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dedbf9d8-ffec-4bcb-b175-20efd7b7366e-cert\") pod \"ingress-canary-bf4j2\" (UID: \"dedbf9d8-ffec-4bcb-b175-20efd7b7366e\") " pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.327357 master-0 kubenswrapper[13205]: I0319 09:25:10.327242 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcv4l\" (UniqueName: \"kubernetes.io/projected/dedbf9d8-ffec-4bcb-b175-20efd7b7366e-kube-api-access-pcv4l\") pod \"ingress-canary-bf4j2\" (UID: \"dedbf9d8-ffec-4bcb-b175-20efd7b7366e\") " pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.392555 master-0 kubenswrapper[13205]: I0319 09:25:10.391266 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bf4j2"] Mar 19 09:25:10.398639 master-0 kubenswrapper[13205]: I0319 09:25:10.398594 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsl2r\" (UniqueName: \"kubernetes.io/projected/e71d0aae-5e50-45f9-a0f2-8f3c3357976a-kube-api-access-bsl2r\") pod \"network-check-source-b4bf74f6-wwd24\" (UID: \"e71d0aae-5e50-45f9-a0f2-8f3c3357976a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" Mar 19 09:25:10.401441 master-0 kubenswrapper[13205]: I0319 09:25:10.401391 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-k4dfd_e7fae040-28fa-4d97-8482-fd0dd12cc921/authentication-operator/1.log" Mar 19 09:25:10.411818 master-0 kubenswrapper[13205]: I0319 09:25:10.411723 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5bgk\" (UniqueName: \"kubernetes.io/projected/0f7b58ba-ff67-416a-880a-b7e0f9a6e35f-kube-api-access-z5bgk\") pod \"router-default-7dcf5569b5-29cq2\" (UID: \"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f\") " pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:10.416115 master-0 kubenswrapper[13205]: I0319 09:25:10.416057 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n"] Mar 19 09:25:10.420590 master-0 kubenswrapper[13205]: W0319 09:25:10.420544 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0bc69d1_1383_478f_9a9e_c23e88646056.slice/crio-35cc3a33b30febb7ac522c9551d1d0a13d6a94838c99a301c721be69ca842c6a WatchSource:0}: Error finding container 35cc3a33b30febb7ac522c9551d1d0a13d6a94838c99a301c721be69ca842c6a: Status 404 returned error can't find the container with id 35cc3a33b30febb7ac522c9551d1d0a13d6a94838c99a301c721be69ca842c6a Mar 19 09:25:10.428211 master-0 kubenswrapper[13205]: I0319 09:25:10.428144 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dedbf9d8-ffec-4bcb-b175-20efd7b7366e-cert\") pod \"ingress-canary-bf4j2\" (UID: \"dedbf9d8-ffec-4bcb-b175-20efd7b7366e\") " pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.428375 master-0 kubenswrapper[13205]: I0319 09:25:10.428233 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcv4l\" (UniqueName: \"kubernetes.io/projected/dedbf9d8-ffec-4bcb-b175-20efd7b7366e-kube-api-access-pcv4l\") pod \"ingress-canary-bf4j2\" (UID: \"dedbf9d8-ffec-4bcb-b175-20efd7b7366e\") " pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.434708 master-0 kubenswrapper[13205]: I0319 09:25:10.434586 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dedbf9d8-ffec-4bcb-b175-20efd7b7366e-cert\") pod \"ingress-canary-bf4j2\" (UID: \"dedbf9d8-ffec-4bcb-b175-20efd7b7366e\") " pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.440429 master-0 kubenswrapper[13205]: I0319 09:25:10.440391 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-57c47bdf6-d9h47_5a51c701-7f2a-4332-a301-746e8a0eb475/fix-audit-permissions/0.log" Mar 19 09:25:10.449561 master-0 kubenswrapper[13205]: I0319 09:25:10.449457 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcv4l\" (UniqueName: \"kubernetes.io/projected/dedbf9d8-ffec-4bcb-b175-20efd7b7366e-kube-api-access-pcv4l\") pod \"ingress-canary-bf4j2\" (UID: \"dedbf9d8-ffec-4bcb-b175-20efd7b7366e\") " pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.523855 master-0 kubenswrapper[13205]: I0319 09:25:10.523792 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bf4j2" Mar 19 09:25:10.538584 master-0 kubenswrapper[13205]: I0319 09:25:10.538481 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5ft4b" podStartSLOduration=4.538465436 podStartE2EDuration="4.538465436s" podCreationTimestamp="2026-03-19 09:25:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:10.532363504 +0000 UTC m=+95.864670392" watchObservedRunningTime="2026-03-19 09:25:10.538465436 +0000 UTC m=+95.870772324" Mar 19 09:25:10.587371 master-0 kubenswrapper[13205]: I0319 09:25:10.585482 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:10.596782 master-0 kubenswrapper[13205]: I0319 09:25:10.596737 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" Mar 19 09:25:10.622771 master-0 kubenswrapper[13205]: I0319 09:25:10.622499 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 09:25:10.640968 master-0 kubenswrapper[13205]: I0319 09:25:10.640931 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-57c47bdf6-d9h47_5a51c701-7f2a-4332-a301-746e8a0eb475/oauth-apiserver/0.log" Mar 19 09:25:10.840642 master-0 kubenswrapper[13205]: I0319 09:25:10.840577 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-cbw4r_16d2930b-486b-492d-983e-c6702d8f53a7/dns-operator/0.log" Mar 19 09:25:10.881086 master-0 kubenswrapper[13205]: I0319 09:25:10.872308 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"155018d1-af14-4adc-b7a0-cab0133dd65f","Type":"ContainerStarted","Data":"15511e8ed5c9afc7170df0ca1831b83c9ff949924ce0e3b4e77f5ca91a7e9907"} Mar 19 09:25:10.881086 master-0 kubenswrapper[13205]: I0319 09:25:10.873352 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" event={"ID":"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f","Type":"ContainerStarted","Data":"06c79d6ab304bd9a80eb36aaae2ab21c9d5b792d6d1d54331c999ea8dbcef6d7"} Mar 19 09:25:10.881086 master-0 kubenswrapper[13205]: I0319 09:25:10.874953 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" 
event={"ID":"b0bc69d1-1383-478f-9a9e-c23e88646056","Type":"ContainerStarted","Data":"35cc3a33b30febb7ac522c9551d1d0a13d6a94838c99a301c721be69ca842c6a"} Mar 19 09:25:10.945371 master-0 kubenswrapper[13205]: I0319 09:25:10.945271 13205 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 09:25:10.979488 master-0 kubenswrapper[13205]: I0319 09:25:10.979408 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bf4j2"] Mar 19 09:25:11.095582 master-0 kubenswrapper[13205]: I0319 09:25:11.094922 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-cbw4r_16d2930b-486b-492d-983e-c6702d8f53a7/kube-rbac-proxy/0.log" Mar 19 09:25:11.212862 master-0 kubenswrapper[13205]: I0319 09:25:11.212155 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24"] Mar 19 09:25:11.649351 master-0 kubenswrapper[13205]: I0319 09:25:11.649319 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7llkw_89f3f27a-83eb-4cd9-b557-aeee15998793/dns-node-resolver/0.log" Mar 19 09:25:11.843675 master-0 kubenswrapper[13205]: I0319 09:25:11.842933 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-5bddk_a1098584-43b9-4f2c-83d2-22d95fb7b0c3/etcd-operator/0.log" Mar 19 09:25:11.883183 master-0 kubenswrapper[13205]: I0319 09:25:11.883126 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" event={"ID":"e71d0aae-5e50-45f9-a0f2-8f3c3357976a","Type":"ContainerStarted","Data":"a6f53a8beaeb4c01eaca85f77ba064599f7c7389962d9c6722b68937bd8ffcb1"} Mar 19 09:25:11.883610 master-0 kubenswrapper[13205]: I0319 09:25:11.883586 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" event={"ID":"e71d0aae-5e50-45f9-a0f2-8f3c3357976a","Type":"ContainerStarted","Data":"7465e2f5cb8d93f826e9e4db86cc062e2120edfc9463070e7258e7c798b5bf89"} Mar 19 09:25:11.886211 master-0 kubenswrapper[13205]: I0319 09:25:11.886164 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"155018d1-af14-4adc-b7a0-cab0133dd65f","Type":"ContainerStarted","Data":"28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c"} Mar 19 09:25:11.888045 master-0 kubenswrapper[13205]: I0319 09:25:11.888001 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bf4j2" event={"ID":"dedbf9d8-ffec-4bcb-b175-20efd7b7366e","Type":"ContainerStarted","Data":"c45905412b674f9aedb6f73205183be69e0c539de0f6353d65355e246f757d9e"} Mar 19 09:25:11.888045 master-0 kubenswrapper[13205]: I0319 09:25:11.888041 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bf4j2" event={"ID":"dedbf9d8-ffec-4bcb-b175-20efd7b7366e","Type":"ContainerStarted","Data":"d14c569106f509e4d4252c804619ac607254868741e3bcc3cf7d6c7e065db4ba"} Mar 19 09:25:12.004598 master-0 kubenswrapper[13205]: I0319 09:25:12.004384 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wwd24" podStartSLOduration=374.004361079 podStartE2EDuration="6m14.004361079s" podCreationTimestamp="2026-03-19 09:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:12.002292227 +0000 UTC m=+97.334599135" watchObservedRunningTime="2026-03-19 09:25:12.004361079 +0000 UTC m=+97.336667967" Mar 19 09:25:12.093793 master-0 kubenswrapper[13205]: I0319 09:25:12.093634 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-5bddk_a1098584-43b9-4f2c-83d2-22d95fb7b0c3/etcd-operator/1.log" Mar 19 09:25:12.281434 master-0 kubenswrapper[13205]: I0319 09:25:12.278342 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bf4j2" podStartSLOduration=3.278318766 podStartE2EDuration="3.278318766s" podCreationTimestamp="2026-03-19 09:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:12.271069785 +0000 UTC m=+97.603376683" watchObservedRunningTime="2026-03-19 09:25:12.278318766 +0000 UTC m=+97.610625654" Mar 19 09:25:12.281434 master-0 kubenswrapper[13205]: I0319 09:25:12.278928 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=3.278922471 podStartE2EDuration="3.278922471s" podCreationTimestamp="2026-03-19 09:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:12.044281253 +0000 UTC m=+97.376588141" watchObservedRunningTime="2026-03-19 09:25:12.278922471 +0000 UTC m=+97.611229359" Mar 19 09:25:12.293732 master-0 kubenswrapper[13205]: I0319 09:25:12.293675 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/setup/0.log" Mar 19 09:25:12.454434 master-0 kubenswrapper[13205]: I0319 09:25:12.452364 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-ensure-env-vars/0.log" Mar 19 09:25:12.642776 master-0 kubenswrapper[13205]: I0319 09:25:12.642611 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-resources-copy/0.log" Mar 19 
09:25:12.850296 master-0 kubenswrapper[13205]: I0319 09:25:12.850154 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 09:25:12.902611 master-0 kubenswrapper[13205]: I0319 09:25:12.902464 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/cluster-autoscaler-operator/0.log" Mar 19 09:25:12.903330 master-0 kubenswrapper[13205]: I0319 09:25:12.903281 13205 generic.go:334] "Generic (PLEG): container finished" podID="6904be4c-4f5f-4176-8100-7b6955c6d8da" containerID="7a13e386375bce18c61ee7b6a70f8617363b05476b2f78b3d5995802bea93bef" exitCode=255 Mar 19 09:25:12.904308 master-0 kubenswrapper[13205]: I0319 09:25:12.904258 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" event={"ID":"6904be4c-4f5f-4176-8100-7b6955c6d8da","Type":"ContainerDied","Data":"7a13e386375bce18c61ee7b6a70f8617363b05476b2f78b3d5995802bea93bef"} Mar 19 09:25:12.905154 master-0 kubenswrapper[13205]: I0319 09:25:12.905124 13205 scope.go:117] "RemoveContainer" containerID="7a13e386375bce18c61ee7b6a70f8617363b05476b2f78b3d5995802bea93bef" Mar 19 09:25:13.054162 master-0 kubenswrapper[13205]: I0319 09:25:13.054112 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log" Mar 19 09:25:13.246544 master-0 kubenswrapper[13205]: I0319 09:25:13.242721 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 09:25:13.313099 master-0 kubenswrapper[13205]: I0319 09:25:13.313040 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kqb7f"] Mar 19 09:25:13.315801 master-0 kubenswrapper[13205]: 
I0319 09:25:13.315743 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.318317 master-0 kubenswrapper[13205]: I0319 09:25:13.317774 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-sjz5s" Mar 19 09:25:13.318465 master-0 kubenswrapper[13205]: I0319 09:25:13.318431 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 09:25:13.319141 master-0 kubenswrapper[13205]: I0319 09:25:13.319114 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 09:25:13.446440 master-0 kubenswrapper[13205]: I0319 09:25:13.446384 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-readyz/0.log" Mar 19 09:25:13.505382 master-0 kubenswrapper[13205]: I0319 09:25:13.505229 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1519cbae-bedd-446b-8600-e5d040b98fc9-certs\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.505382 master-0 kubenswrapper[13205]: I0319 09:25:13.505284 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdr2j\" (UniqueName: \"kubernetes.io/projected/1519cbae-bedd-446b-8600-e5d040b98fc9-kube-api-access-jdr2j\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.505382 master-0 kubenswrapper[13205]: I0319 09:25:13.505353 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1519cbae-bedd-446b-8600-e5d040b98fc9-node-bootstrap-token\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.606184 master-0 kubenswrapper[13205]: I0319 09:25:13.606123 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1519cbae-bedd-446b-8600-e5d040b98fc9-certs\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.606566 master-0 kubenswrapper[13205]: I0319 09:25:13.606512 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdr2j\" (UniqueName: \"kubernetes.io/projected/1519cbae-bedd-446b-8600-e5d040b98fc9-kube-api-access-jdr2j\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.606814 master-0 kubenswrapper[13205]: I0319 09:25:13.606768 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1519cbae-bedd-446b-8600-e5d040b98fc9-node-bootstrap-token\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.611203 master-0 kubenswrapper[13205]: I0319 09:25:13.611171 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1519cbae-bedd-446b-8600-e5d040b98fc9-node-bootstrap-token\") pod \"machine-config-server-kqb7f\" (UID: 
\"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.613496 master-0 kubenswrapper[13205]: I0319 09:25:13.613471 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1519cbae-bedd-446b-8600-e5d040b98fc9-certs\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.628140 master-0 kubenswrapper[13205]: I0319 09:25:13.627609 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdr2j\" (UniqueName: \"kubernetes.io/projected/1519cbae-bedd-446b-8600-e5d040b98fc9-kube-api-access-jdr2j\") pod \"machine-config-server-kqb7f\" (UID: \"1519cbae-bedd-446b-8600-e5d040b98fc9\") " pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.643338 master-0 kubenswrapper[13205]: I0319 09:25:13.642606 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 09:25:13.649546 master-0 kubenswrapper[13205]: I0319 09:25:13.649473 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kqb7f" Mar 19 09:25:13.814864 master-0 kubenswrapper[13205]: W0319 09:25:13.814808 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1519cbae_bedd_446b_8600_e5d040b98fc9.slice/crio-db065898eea56481cc36b4b8a6e377b2fa44f099d017478071918d620dc7ffbe WatchSource:0}: Error finding container db065898eea56481cc36b4b8a6e377b2fa44f099d017478071918d620dc7ffbe: Status 404 returned error can't find the container with id db065898eea56481cc36b4b8a6e377b2fa44f099d017478071918d620dc7ffbe Mar 19 09:25:13.839029 master-0 kubenswrapper[13205]: I0319 09:25:13.838974 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a/installer/0.log" Mar 19 09:25:13.917100 master-0 kubenswrapper[13205]: I0319 09:25:13.917058 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-fvh8d_8beda3a0-a653-4810-b3f2-d25badb21ab1/multus-admission-controller/0.log" Mar 19 09:25:13.927942 master-0 kubenswrapper[13205]: I0319 09:25:13.917106 13205 generic.go:334] "Generic (PLEG): container finished" podID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerID="ec2a8f37c4a4bf290761beb86b8148cabc7c9a7b8241accf763dd14e9ad11acc" exitCode=137 Mar 19 09:25:13.927942 master-0 kubenswrapper[13205]: I0319 09:25:13.917158 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" event={"ID":"8beda3a0-a653-4810-b3f2-d25badb21ab1","Type":"ContainerDied","Data":"ec2a8f37c4a4bf290761beb86b8148cabc7c9a7b8241accf763dd14e9ad11acc"} Mar 19 09:25:13.927942 master-0 kubenswrapper[13205]: I0319 09:25:13.918223 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kqb7f" 
event={"ID":"1519cbae-bedd-446b-8600-e5d040b98fc9","Type":"ContainerStarted","Data":"db065898eea56481cc36b4b8a6e377b2fa44f099d017478071918d620dc7ffbe"} Mar 19 09:25:14.041696 master-0 kubenswrapper[13205]: I0319 09:25:14.041654 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-rvwfh_03d12dab-1215-4c1f-a9f5-27ea7174d308/ingress-operator/0.log" Mar 19 09:25:14.146344 master-0 kubenswrapper[13205]: I0319 09:25:14.146297 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-fvh8d_8beda3a0-a653-4810-b3f2-d25badb21ab1/multus-admission-controller/0.log" Mar 19 09:25:14.146515 master-0 kubenswrapper[13205]: I0319 09:25:14.146402 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:25:14.215471 master-0 kubenswrapper[13205]: I0319 09:25:14.215376 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") pod \"8beda3a0-a653-4810-b3f2-d25badb21ab1\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " Mar 19 09:25:14.215725 master-0 kubenswrapper[13205]: I0319 09:25:14.215691 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") pod \"8beda3a0-a653-4810-b3f2-d25badb21ab1\" (UID: \"8beda3a0-a653-4810-b3f2-d25badb21ab1\") " Mar 19 09:25:14.219699 master-0 kubenswrapper[13205]: I0319 09:25:14.219637 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw" (OuterVolumeSpecName: "kube-api-access-tgtgw") pod "8beda3a0-a653-4810-b3f2-d25badb21ab1" (UID: 
"8beda3a0-a653-4810-b3f2-d25badb21ab1"). InnerVolumeSpecName "kube-api-access-tgtgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:25:14.220341 master-0 kubenswrapper[13205]: I0319 09:25:14.220302 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "8beda3a0-a653-4810-b3f2-d25badb21ab1" (UID: "8beda3a0-a653-4810-b3f2-d25badb21ab1"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:25:14.236741 master-0 kubenswrapper[13205]: I0319 09:25:14.236690 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-rvwfh_03d12dab-1215-4c1f-a9f5-27ea7174d308/kube-rbac-proxy/0.log" Mar 19 09:25:14.317401 master-0 kubenswrapper[13205]: I0319 09:25:14.317278 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgtgw\" (UniqueName: \"kubernetes.io/projected/8beda3a0-a653-4810-b3f2-d25badb21ab1-kube-api-access-tgtgw\") on node \"master-0\" DevicePath \"\"" Mar 19 09:25:14.317401 master-0 kubenswrapper[13205]: I0319 09:25:14.317327 13205 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8beda3a0-a653-4810-b3f2-d25badb21ab1-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:25:14.638943 master-0 kubenswrapper[13205]: I0319 09:25:14.638881 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-7qnf9_e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/kube-apiserver-operator/0.log" Mar 19 09:25:14.837486 master-0 kubenswrapper[13205]: I0319 09:25:14.837349 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-7qnf9_e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/kube-apiserver-operator/1.log" 
Mar 19 09:25:14.931703 master-0 kubenswrapper[13205]: I0319 09:25:14.931392 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/cluster-autoscaler-operator/0.log" Mar 19 09:25:14.932450 master-0 kubenswrapper[13205]: I0319 09:25:14.931815 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-rsnsn" event={"ID":"6904be4c-4f5f-4176-8100-7b6955c6d8da","Type":"ContainerStarted","Data":"b963e3b8d509d8435e0ac3f76e7032047cf5a5056edab6d99e3d7f93cc2c9066"} Mar 19 09:25:14.933618 master-0 kubenswrapper[13205]: I0319 09:25:14.933578 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" event={"ID":"0f7b58ba-ff67-416a-880a-b7e0f9a6e35f","Type":"ContainerStarted","Data":"be79a856d30188829db928f9b3d3dd8a6e46081e47d8e58f45cb7e03d84c26fa"} Mar 19 09:25:14.937553 master-0 kubenswrapper[13205]: I0319 09:25:14.936755 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" event={"ID":"b0bc69d1-1383-478f-9a9e-c23e88646056","Type":"ContainerStarted","Data":"9e048306c345e35e5c2f227c4b8e84c04fcc41a90b3bdf7d53d07323e4c7609a"} Mar 19 09:25:14.938118 master-0 kubenswrapper[13205]: I0319 09:25:14.938083 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" Mar 19 09:25:14.939357 master-0 kubenswrapper[13205]: I0319 09:25:14.939312 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-fvh8d_8beda3a0-a653-4810-b3f2-d25badb21ab1/multus-admission-controller/0.log" Mar 19 09:25:14.939514 master-0 kubenswrapper[13205]: I0319 09:25:14.939470 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" Mar 19 09:25:14.939810 master-0 kubenswrapper[13205]: I0319 09:25:14.939574 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d" event={"ID":"8beda3a0-a653-4810-b3f2-d25badb21ab1","Type":"ContainerDied","Data":"3d5b3f08e9980af7a4eb46a62a5af4211db365f64342fe1705f26fa41b7b1331"} Mar 19 09:25:14.944947 master-0 kubenswrapper[13205]: I0319 09:25:14.942619 13205 scope.go:117] "RemoveContainer" containerID="5a1232e74d2b81fa0fb089837e46ec811c58ea20165c36d4de9800956bf481df" Mar 19 09:25:14.951985 master-0 kubenswrapper[13205]: I0319 09:25:14.951932 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" Mar 19 09:25:14.953365 master-0 kubenswrapper[13205]: I0319 09:25:14.953275 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kqb7f" event={"ID":"1519cbae-bedd-446b-8600-e5d040b98fc9","Type":"ContainerStarted","Data":"2c4611dac82066682322c85e04fe2dd4faae30c78c976ff8f9607f7ea675c4be"} Mar 19 09:25:14.967202 master-0 kubenswrapper[13205]: I0319 09:25:14.967158 13205 scope.go:117] "RemoveContainer" containerID="ec2a8f37c4a4bf290761beb86b8148cabc7c9a7b8241accf763dd14e9ad11acc" Mar 19 09:25:15.011825 master-0 kubenswrapper[13205]: I0319 09:25:15.011759 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"] Mar 19 09:25:15.046082 master-0 kubenswrapper[13205]: I0319 09:25:15.046032 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-fvh8d"] Mar 19 09:25:15.047848 master-0 kubenswrapper[13205]: I0319 09:25:15.047828 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_434aabfa-50db-407e-92d3-a034696613e3/installer/0.log" Mar 19 09:25:15.109711 master-0 kubenswrapper[13205]: I0319 09:25:15.108140 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kqb7f" podStartSLOduration=2.10811944 podStartE2EDuration="2.10811944s" podCreationTimestamp="2026-03-19 09:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:15.106338085 +0000 UTC m=+100.438644993" watchObservedRunningTime="2026-03-19 09:25:15.10811944 +0000 UTC m=+100.440426338" Mar 19 09:25:15.109711 master-0 kubenswrapper[13205]: I0319 09:25:15.109101 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-7wt9n" podStartSLOduration=46.7211701 podStartE2EDuration="50.109093003s" podCreationTimestamp="2026-03-19 09:24:25 +0000 UTC" firstStartedPulling="2026-03-19 09:25:10.422934126 +0000 UTC m=+95.755241014" lastFinishedPulling="2026-03-19 09:25:13.810857029 +0000 UTC m=+99.143163917" observedRunningTime="2026-03-19 09:25:15.073253361 +0000 UTC m=+100.405560249" watchObservedRunningTime="2026-03-19 09:25:15.109093003 +0000 UTC m=+100.441399891" Mar 19 09:25:15.171362 master-0 kubenswrapper[13205]: I0319 09:25:15.171158 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" podStartSLOduration=49.954214538 podStartE2EDuration="53.17113894s" podCreationTimestamp="2026-03-19 09:24:22 +0000 UTC" firstStartedPulling="2026-03-19 09:25:10.652404255 +0000 UTC m=+95.984711143" lastFinishedPulling="2026-03-19 09:25:13.869328667 +0000 UTC m=+99.201635545" observedRunningTime="2026-03-19 09:25:15.169299664 +0000 UTC m=+100.501606552" watchObservedRunningTime="2026-03-19 
09:25:15.17113894 +0000 UTC m=+100.503445848" Mar 19 09:25:15.278548 master-0 kubenswrapper[13205]: I0319 09:25:15.271165 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_ff98fb1e-7a1f-4657-b085-743d6f2d28e2/installer/0.log" Mar 19 09:25:15.586808 master-0 kubenswrapper[13205]: I0319 09:25:15.586624 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:15.590799 master-0 kubenswrapper[13205]: I0319 09:25:15.590754 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:15.966717 master-0 kubenswrapper[13205]: I0319 09:25:15.965510 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:15.968422 master-0 kubenswrapper[13205]: I0319 09:25:15.968353 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7dcf5569b5-29cq2" Mar 19 09:25:16.866785 master-0 kubenswrapper[13205]: I0319 09:25:16.863230 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" path="/var/lib/kubelet/pods/8beda3a0-a653-4810-b3f2-d25badb21ab1/volumes" Mar 19 09:25:17.457712 master-0 kubenswrapper[13205]: I0319 09:25:17.457660 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/setup/0.log" Mar 19 09:25:18.385421 master-0 kubenswrapper[13205]: I0319 09:25:18.385372 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver/0.log" Mar 19 09:25:18.391324 master-0 kubenswrapper[13205]: I0319 09:25:18.391279 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 09:25:18.393490 master-0 kubenswrapper[13205]: I0319 09:25:18.393438 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="155018d1-af14-4adc-b7a0-cab0133dd65f" containerName="installer" containerID="cri-o://28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c" gracePeriod=30 Mar 19 09:25:18.472371 master-0 kubenswrapper[13205]: I0319 09:25:18.472313 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-xcb24"] Mar 19 09:25:18.472948 master-0 kubenswrapper[13205]: E0319 09:25:18.472592 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="multus-admission-controller" Mar 19 09:25:18.472948 master-0 kubenswrapper[13205]: I0319 09:25:18.472608 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="multus-admission-controller" Mar 19 09:25:18.472948 master-0 kubenswrapper[13205]: E0319 09:25:18.472624 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="kube-rbac-proxy" Mar 19 09:25:18.472948 master-0 kubenswrapper[13205]: I0319 09:25:18.472632 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="kube-rbac-proxy" Mar 19 09:25:18.472948 master-0 kubenswrapper[13205]: I0319 09:25:18.472759 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="kube-rbac-proxy" Mar 19 09:25:18.472948 master-0 kubenswrapper[13205]: I0319 09:25:18.472786 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8beda3a0-a653-4810-b3f2-d25badb21ab1" containerName="multus-admission-controller" Mar 19 09:25:18.473351 master-0 
kubenswrapper[13205]: I0319 09:25:18.473310 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.477041 master-0 kubenswrapper[13205]: I0319 09:25:18.476995 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 09:25:18.477236 master-0 kubenswrapper[13205]: I0319 09:25:18.477212 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 09:25:18.478128 master-0 kubenswrapper[13205]: I0319 09:25:18.478104 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 09:25:18.478549 master-0 kubenswrapper[13205]: I0319 09:25:18.478491 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-2zxtq" Mar 19 09:25:18.480956 master-0 kubenswrapper[13205]: I0319 09:25:18.480916 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 09:25:18.495065 master-0 kubenswrapper[13205]: I0319 09:25:18.494937 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 09:25:18.544081 master-0 kubenswrapper[13205]: I0319 09:25:18.538481 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-xcb24"] Mar 19 09:25:18.544290 master-0 kubenswrapper[13205]: I0319 09:25:18.544166 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver-cert-syncer/0.log" Mar 19 09:25:18.576017 master-0 kubenswrapper[13205]: I0319 09:25:18.575978 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-serving-cert\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.576205 master-0 kubenswrapper[13205]: I0319 09:25:18.576040 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjb8\" (UniqueName: \"kubernetes.io/projected/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-kube-api-access-4bjb8\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.576238 master-0 kubenswrapper[13205]: I0319 09:25:18.576181 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-trusted-ca\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.576290 master-0 kubenswrapper[13205]: I0319 09:25:18.576261 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-config\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.640518 master-0 kubenswrapper[13205]: I0319 09:25:18.640372 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver-cert-regeneration-controller/0.log" Mar 19 09:25:18.677065 master-0 kubenswrapper[13205]: I0319 09:25:18.677022 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-serving-cert\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.677065 master-0 kubenswrapper[13205]: I0319 09:25:18.677067 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjb8\" (UniqueName: \"kubernetes.io/projected/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-kube-api-access-4bjb8\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.677299 master-0 kubenswrapper[13205]: I0319 09:25:18.677087 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-trusted-ca\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.677341 master-0 kubenswrapper[13205]: I0319 09:25:18.677293 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-config\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.678113 master-0 kubenswrapper[13205]: I0319 09:25:18.678090 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-config\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " 
pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.678199 master-0 kubenswrapper[13205]: I0319 09:25:18.678179 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-trusted-ca\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.680142 master-0 kubenswrapper[13205]: I0319 09:25:18.680104 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-serving-cert\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.718679 master-0 kubenswrapper[13205]: I0319 09:25:18.718605 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver-insecure-readyz/0.log" Mar 19 09:25:18.732699 master-0 kubenswrapper[13205]: I0319 09:25:18.732647 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjb8\" (UniqueName: \"kubernetes.io/projected/7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0-kube-api-access-4bjb8\") pod \"console-operator-76b6568d85-xcb24\" (UID: \"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0\") " pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.761150 master-0 kubenswrapper[13205]: I0319 09:25:18.761107 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver-check-endpoints/0.log" Mar 19 09:25:18.769378 master-0 kubenswrapper[13205]: I0319 09:25:18.769346 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_43ca4232-9e9c-4b97-9c29-bead80a9a5fa/installer/0.log" Mar 19 09:25:18.792257 master-0 kubenswrapper[13205]: I0319 09:25:18.792208 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:18.798549 master-0 kubenswrapper[13205]: I0319 09:25:18.798494 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/0.log" Mar 19 09:25:18.869802 master-0 kubenswrapper[13205]: I0319 09:25:18.869142 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld"] Mar 19 09:25:18.870224 master-0 kubenswrapper[13205]: I0319 09:25:18.870188 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:18.876666 master-0 kubenswrapper[13205]: I0319 09:25:18.873387 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-92z97" Mar 19 09:25:18.876666 master-0 kubenswrapper[13205]: I0319 09:25:18.873459 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 19 09:25:18.876666 master-0 kubenswrapper[13205]: I0319 09:25:18.873761 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 09:25:18.876666 master-0 kubenswrapper[13205]: I0319 09:25:18.873832 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 09:25:18.980720 master-0 kubenswrapper[13205]: I0319 09:25:18.980642 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/793e12a6-caff-4738-96ad-da1377e09fe8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:18.980883 master-0 kubenswrapper[13205]: I0319 09:25:18.980803 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/793e12a6-caff-4738-96ad-da1377e09fe8-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:18.981146 master-0 kubenswrapper[13205]: I0319 09:25:18.981104 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/793e12a6-caff-4738-96ad-da1377e09fe8-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:18.981249 master-0 kubenswrapper[13205]: I0319 09:25:18.981218 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgshf\" (UniqueName: \"kubernetes.io/projected/793e12a6-caff-4738-96ad-da1377e09fe8-kube-api-access-pgshf\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.083232 master-0 kubenswrapper[13205]: I0319 09:25:19.083143 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/793e12a6-caff-4738-96ad-da1377e09fe8-prometheus-operator-tls\") pod 
\"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.083499 master-0 kubenswrapper[13205]: I0319 09:25:19.083445 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgshf\" (UniqueName: \"kubernetes.io/projected/793e12a6-caff-4738-96ad-da1377e09fe8-kube-api-access-pgshf\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.083830 master-0 kubenswrapper[13205]: I0319 09:25:19.083763 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/793e12a6-caff-4738-96ad-da1377e09fe8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.083890 master-0 kubenswrapper[13205]: I0319 09:25:19.083876 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/793e12a6-caff-4738-96ad-da1377e09fe8-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.084831 master-0 kubenswrapper[13205]: I0319 09:25:19.084793 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/793e12a6-caff-4738-96ad-da1377e09fe8-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.087037 master-0 kubenswrapper[13205]: I0319 
09:25:19.086989 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/793e12a6-caff-4738-96ad-da1377e09fe8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.089094 master-0 kubenswrapper[13205]: I0319 09:25:19.089062 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/793e12a6-caff-4738-96ad-da1377e09fe8-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.111583 master-0 kubenswrapper[13205]: I0319 09:25:19.111504 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld"] Mar 19 09:25:19.204918 master-0 kubenswrapper[13205]: I0319 09:25:19.204788 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgshf\" (UniqueName: \"kubernetes.io/projected/793e12a6-caff-4738-96ad-da1377e09fe8-kube-api-access-pgshf\") pod \"prometheus-operator-6c8df6d4b-9lpld\" (UID: \"793e12a6-caff-4738-96ad-da1377e09fe8\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.343716 master-0 kubenswrapper[13205]: I0319 09:25:19.343669 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/1.log" Mar 19 09:25:19.362149 master-0 kubenswrapper[13205]: I0319 09:25:19.362114 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-xcb24"] Mar 19 09:25:19.366680 master-0 
kubenswrapper[13205]: W0319 09:25:19.366636 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d185b6f_c2ea_4570_a9a0_9b2562e0a2b0.slice/crio-de4bf8fac6daa21e558198b17151a7cccc01fb1051727183ba9e14c2614033ef WatchSource:0}: Error finding container de4bf8fac6daa21e558198b17151a7cccc01fb1051727183ba9e14c2614033ef: Status 404 returned error can't find the container with id de4bf8fac6daa21e558198b17151a7cccc01fb1051727183ba9e14c2614033ef Mar 19 09:25:19.492404 master-0 kubenswrapper[13205]: I0319 09:25:19.492253 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" Mar 19 09:25:19.551569 master-0 kubenswrapper[13205]: I0319 09:25:19.549765 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/0.log" Mar 19 09:25:19.611150 master-0 kubenswrapper[13205]: I0319 09:25:19.611086 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log" Mar 19 09:25:19.636101 master-0 kubenswrapper[13205]: I0319 09:25:19.636028 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-recovery-controller/0.log" Mar 19 09:25:19.663318 master-0 kubenswrapper[13205]: I0319 09:25:19.663259 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-pvlq6_f0c75102-6790-4ed3-84da-61c3611186f8/kube-controller-manager-operator/0.log" Mar 19 09:25:19.673849 master-0 kubenswrapper[13205]: I0319 09:25:19.673723 13205 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-pvlq6_f0c75102-6790-4ed3-84da-61c3611186f8/kube-controller-manager-operator/1.log" Mar 19 09:25:19.685314 master-0 kubenswrapper[13205]: I0319 09:25:19.685266 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_014ef8bd-b940-41e2-9239-c238afe6ebae/installer/0.log" Mar 19 09:25:19.805495 master-0 kubenswrapper[13205]: I0319 09:25:19.805321 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/wait-for-host-port/0.log" Mar 19 09:25:19.920971 master-0 kubenswrapper[13205]: I0319 09:25:19.920888 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld"] Mar 19 09:25:20.006974 master-0 kubenswrapper[13205]: I0319 09:25:20.006914 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" event={"ID":"793e12a6-caff-4738-96ad-da1377e09fe8","Type":"ContainerStarted","Data":"b052a521e50cbf1492a64eb24863e39fb832ecb298e58405e8f7bc7a26accaa8"} Mar 19 09:25:20.007987 master-0 kubenswrapper[13205]: I0319 09:25:20.007947 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-xcb24" event={"ID":"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0","Type":"ContainerStarted","Data":"de4bf8fac6daa21e558198b17151a7cccc01fb1051727183ba9e14c2614033ef"} Mar 19 09:25:20.009390 master-0 kubenswrapper[13205]: I0319 09:25:20.009372 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log" Mar 19 09:25:20.270450 master-0 kubenswrapper[13205]: I0319 09:25:20.270348 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log" Mar 19 09:25:20.405661 master-0 kubenswrapper[13205]: I0319 09:25:20.405568 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-recovery-controller/0.log" Mar 19 09:25:20.611763 master-0 kubenswrapper[13205]: I0319 09:25:20.611507 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-zddz9_d664acc4-ec4f-4078-ae93-404a14ea18fc/kube-scheduler-operator-container/0.log" Mar 19 09:25:20.804604 master-0 kubenswrapper[13205]: I0319 09:25:20.804515 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-zddz9_d664acc4-ec4f-4078-ae93-404a14ea18fc/kube-scheduler-operator-container/1.log" Mar 19 09:25:21.002978 master-0 kubenswrapper[13205]: I0319 09:25:21.002771 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/egress-router-binary-copy/0.log" Mar 19 09:25:21.202233 master-0 kubenswrapper[13205]: I0319 09:25:21.202146 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/cni-plugins/0.log" Mar 19 09:25:21.410238 master-0 kubenswrapper[13205]: I0319 09:25:21.410068 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/bond-cni-plugin/0.log" Mar 19 09:25:21.601184 master-0 kubenswrapper[13205]: I0319 09:25:21.601155 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/routeoverride-cni/0.log" Mar 19 09:25:21.802979 master-0 kubenswrapper[13205]: I0319 09:25:21.802915 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/whereabouts-cni-bincopy/0.log" Mar 19 09:25:21.858545 master-0 kubenswrapper[13205]: I0319 09:25:21.858485 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 09:25:21.859377 master-0 kubenswrapper[13205]: I0319 09:25:21.859351 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:21.869743 master-0 kubenswrapper[13205]: I0319 09:25:21.869130 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 09:25:21.926771 master-0 kubenswrapper[13205]: I0319 09:25:21.926722 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-var-lock\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:21.927391 master-0 kubenswrapper[13205]: I0319 09:25:21.927344 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:21.927805 master-0 kubenswrapper[13205]: I0319 09:25:21.927785 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/0f12c099-d9a7-48a9-9965-c339c4e32d31-kube-api-access\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.001314 master-0 kubenswrapper[13205]: I0319 09:25:22.001275 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/whereabouts-cni/0.log" Mar 19 09:25:22.028614 master-0 kubenswrapper[13205]: I0319 09:25:22.028566 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-var-lock\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.028813 master-0 kubenswrapper[13205]: I0319 09:25:22.028689 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-var-lock\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.028813 master-0 kubenswrapper[13205]: I0319 09:25:22.028740 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.028871 master-0 kubenswrapper[13205]: I0319 09:25:22.028834 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " 
pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.028909 master-0 kubenswrapper[13205]: I0319 09:25:22.028898 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f12c099-d9a7-48a9-9965-c339c4e32d31-kube-api-access\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.046712 master-0 kubenswrapper[13205]: I0319 09:25:22.046667 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f12c099-d9a7-48a9-9965-c339c4e32d31-kube-api-access\") pod \"installer-3-master-0\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.184611 master-0 kubenswrapper[13205]: I0319 09:25:22.184129 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:25:22.202237 master-0 kubenswrapper[13205]: I0319 09:25:22.202186 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/kube-multus-additional-cni-plugins/0.log" Mar 19 09:25:22.803483 master-0 kubenswrapper[13205]: I0319 09:25:22.803443 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-64f78496dd-kwdfq_165e3498-b49e-42fa-a614-0680f8c93fc7/multus-admission-controller/0.log" Mar 19 09:25:22.888218 master-0 kubenswrapper[13205]: I0319 09:25:22.888155 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 09:25:23.002705 master-0 kubenswrapper[13205]: I0319 09:25:23.002577 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-admission-controller-64f78496dd-kwdfq_165e3498-b49e-42fa-a614-0680f8c93fc7/kube-rbac-proxy/0.log" Mar 19 09:25:23.034713 master-0 kubenswrapper[13205]: I0319 09:25:23.032948 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" event={"ID":"793e12a6-caff-4738-96ad-da1377e09fe8","Type":"ContainerStarted","Data":"b64db200dcfda5566d8b5fc65160c1b2dd69a44c96c6c7b13890133da2f4aa24"} Mar 19 09:25:23.034713 master-0 kubenswrapper[13205]: I0319 09:25:23.033022 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" event={"ID":"793e12a6-caff-4738-96ad-da1377e09fe8","Type":"ContainerStarted","Data":"3c3914ab957cd588bd961eecccc0367975a43649b348d49076b984042deec4e5"} Mar 19 09:25:23.036628 master-0 kubenswrapper[13205]: I0319 09:25:23.036582 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-xcb24" event={"ID":"7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0","Type":"ContainerStarted","Data":"a63d873b271bcf78ed82594b5191f7ae3887f9d1b2535c6fe9a0c8ef4014a4bf"} Mar 19 09:25:23.036889 master-0 kubenswrapper[13205]: I0319 09:25:23.036843 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:23.039030 master-0 kubenswrapper[13205]: I0319 09:25:23.039001 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"0f12c099-d9a7-48a9-9965-c339c4e32d31","Type":"ContainerStarted","Data":"d4d4e5484af07923bd4b1ed02e7c532a21065e50ad6a24eb1106acca6ea29449"} Mar 19 09:25:23.057145 master-0 kubenswrapper[13205]: I0319 09:25:23.057050 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-9lpld" podStartSLOduration=2.530614007 
podStartE2EDuration="5.05703557s" podCreationTimestamp="2026-03-19 09:25:18 +0000 UTC" firstStartedPulling="2026-03-19 09:25:19.933239249 +0000 UTC m=+105.265546137" lastFinishedPulling="2026-03-19 09:25:22.459660812 +0000 UTC m=+107.791967700" observedRunningTime="2026-03-19 09:25:23.056332792 +0000 UTC m=+108.388639680" watchObservedRunningTime="2026-03-19 09:25:23.05703557 +0000 UTC m=+108.389342458" Mar 19 09:25:23.080952 master-0 kubenswrapper[13205]: I0319 09:25:23.080667 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b6568d85-xcb24" podStartSLOduration=1.990278831 podStartE2EDuration="5.080645108s" podCreationTimestamp="2026-03-19 09:25:18 +0000 UTC" firstStartedPulling="2026-03-19 09:25:19.369596222 +0000 UTC m=+104.701903120" lastFinishedPulling="2026-03-19 09:25:22.459962509 +0000 UTC m=+107.792269397" observedRunningTime="2026-03-19 09:25:23.07872581 +0000 UTC m=+108.411032698" watchObservedRunningTime="2026-03-19 09:25:23.080645108 +0000 UTC m=+108.412951996" Mar 19 09:25:23.171027 master-0 kubenswrapper[13205]: I0319 09:25:23.170972 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b6568d85-xcb24" Mar 19 09:25:23.209554 master-0 kubenswrapper[13205]: I0319 09:25:23.208382 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/0.log" Mar 19 09:25:23.361604 master-0 kubenswrapper[13205]: I0319 09:25:23.361231 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-66b8ffb895-hfq5p"] Mar 19 09:25:23.362078 master-0 kubenswrapper[13205]: I0319 09:25:23.362031 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-66b8ffb895-hfq5p" Mar 19 09:25:23.365106 master-0 kubenswrapper[13205]: I0319 09:25:23.365066 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 09:25:23.365314 master-0 kubenswrapper[13205]: I0319 09:25:23.365288 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 09:25:23.365452 master-0 kubenswrapper[13205]: I0319 09:25:23.365429 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-rcdtx" Mar 19 09:25:23.385733 master-0 kubenswrapper[13205]: I0319 09:25:23.385672 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-hfq5p"] Mar 19 09:25:23.406676 master-0 kubenswrapper[13205]: I0319 09:25:23.406628 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/1.log" Mar 19 09:25:23.472487 master-0 kubenswrapper[13205]: I0319 09:25:23.472411 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzzrk\" (UniqueName: \"kubernetes.io/projected/437ab63c-8bc0-4761-81fd-0da0052a9628-kube-api-access-xzzrk\") pod \"downloads-66b8ffb895-hfq5p\" (UID: \"437ab63c-8bc0-4761-81fd-0da0052a9628\") " pod="openshift-console/downloads-66b8ffb895-hfq5p" Mar 19 09:25:23.573697 master-0 kubenswrapper[13205]: I0319 09:25:23.573636 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzzrk\" (UniqueName: \"kubernetes.io/projected/437ab63c-8bc0-4761-81fd-0da0052a9628-kube-api-access-xzzrk\") pod \"downloads-66b8ffb895-hfq5p\" (UID: \"437ab63c-8bc0-4761-81fd-0da0052a9628\") " pod="openshift-console/downloads-66b8ffb895-hfq5p" Mar 19 09:25:23.592611 master-0 kubenswrapper[13205]: I0319 09:25:23.591246 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzzrk\" (UniqueName: \"kubernetes.io/projected/437ab63c-8bc0-4761-81fd-0da0052a9628-kube-api-access-xzzrk\") pod \"downloads-66b8ffb895-hfq5p\" (UID: \"437ab63c-8bc0-4761-81fd-0da0052a9628\") " pod="openshift-console/downloads-66b8ffb895-hfq5p" Mar 19 09:25:23.606427 master-0 kubenswrapper[13205]: I0319 09:25:23.606384 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nq9vs_13072c08-c77c-4170-9ebe-98d63968747b/network-metrics-daemon/0.log" Mar 19 09:25:23.693608 master-0 kubenswrapper[13205]: I0319 09:25:23.690896 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-66b8ffb895-hfq5p" Mar 19 09:25:23.809558 master-0 kubenswrapper[13205]: I0319 09:25:23.802152 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nq9vs_13072c08-c77c-4170-9ebe-98d63968747b/kube-rbac-proxy/0.log" Mar 19 09:25:24.004731 master-0 kubenswrapper[13205]: I0319 09:25:24.004341 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-control-plane-57f769d897-hcnr7_41659a48-5eea-41cd-8b2a-b683dc15cc11/kube-rbac-proxy/0.log" Mar 19 09:25:24.052414 master-0 kubenswrapper[13205]: I0319 09:25:24.051390 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"0f12c099-d9a7-48a9-9965-c339c4e32d31","Type":"ContainerStarted","Data":"d5263ec6b799cc073e1945d22bbc2ea8e25dd090a5b022d429fcdd2f5e70a626"} Mar 19 09:25:24.075876 master-0 kubenswrapper[13205]: I0319 09:25:24.075798 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=3.075773768 podStartE2EDuration="3.075773768s" podCreationTimestamp="2026-03-19 09:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:24.070556339 +0000 UTC m=+109.402863237" watchObservedRunningTime="2026-03-19 09:25:24.075773768 +0000 UTC m=+109.408080656" Mar 19 09:25:24.102020 master-0 kubenswrapper[13205]: I0319 09:25:24.101965 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-hfq5p"] Mar 19 09:25:24.109015 master-0 kubenswrapper[13205]: W0319 09:25:24.108973 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ab63c_8bc0_4761_81fd_0da0052a9628.slice/crio-c445887189f0624c76962b8977d02d60499ab06b3ecee6519293ad8c1ec7fb9c WatchSource:0}: Error finding container c445887189f0624c76962b8977d02d60499ab06b3ecee6519293ad8c1ec7fb9c: Status 404 returned error can't find the container with id c445887189f0624c76962b8977d02d60499ab06b3ecee6519293ad8c1ec7fb9c Mar 19 09:25:24.210634 master-0 kubenswrapper[13205]: I0319 09:25:24.208297 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-control-plane-57f769d897-hcnr7_41659a48-5eea-41cd-8b2a-b683dc15cc11/ovnkube-cluster-manager/0.log" Mar 19 09:25:24.466052 master-0 kubenswrapper[13205]: I0319 09:25:24.465991 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/kubecfg-setup/0.log" Mar 19 09:25:24.761470 master-0 kubenswrapper[13205]: I0319 09:25:24.761355 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/ovn-controller/0.log" Mar 19 09:25:24.808604 master-0 kubenswrapper[13205]: I0319 09:25:24.808516 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/ovn-acl-logging/0.log" Mar 19 09:25:25.057994 master-0 
kubenswrapper[13205]: I0319 09:25:25.057940 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-hfq5p" event={"ID":"437ab63c-8bc0-4761-81fd-0da0052a9628","Type":"ContainerStarted","Data":"c445887189f0624c76962b8977d02d60499ab06b3ecee6519293ad8c1ec7fb9c"} Mar 19 09:25:25.094736 master-0 kubenswrapper[13205]: I0319 09:25:25.094692 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:25:25.098717 master-0 kubenswrapper[13205]: I0319 09:25:25.098610 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/745093e5-ffe1-4443-b317-448948f3b311-metrics-tls\") pod \"dns-default-79jrh\" (UID: \"745093e5-ffe1-4443-b317-448948f3b311\") " pod="openshift-dns/dns-default-79jrh" Mar 19 09:25:25.245945 master-0 kubenswrapper[13205]: I0319 09:25:25.245902 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-79jrh" Mar 19 09:25:25.624081 master-0 kubenswrapper[13205]: I0319 09:25:25.623961 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/kube-rbac-proxy-node/0.log" Mar 19 09:25:25.772388 master-0 kubenswrapper[13205]: I0319 09:25:25.772324 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-79jrh"] Mar 19 09:25:25.774491 master-0 kubenswrapper[13205]: I0319 09:25:25.774458 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/kube-rbac-proxy-ovn-metrics/0.log" Mar 19 09:25:25.785193 master-0 kubenswrapper[13205]: W0319 09:25:25.785133 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod745093e5_ffe1_4443_b317_448948f3b311.slice/crio-5cdc2f52ac7ce5d2553ae4f31ba647ca8f1149441be76dca799c49fc580f4722 WatchSource:0}: Error finding container 5cdc2f52ac7ce5d2553ae4f31ba647ca8f1149441be76dca799c49fc580f4722: Status 404 returned error can't find the container with id 5cdc2f52ac7ce5d2553ae4f31ba647ca8f1149441be76dca799c49fc580f4722 Mar 19 09:25:25.936182 master-0 kubenswrapper[13205]: I0319 09:25:25.935314 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/northd/0.log" Mar 19 09:25:26.067957 master-0 kubenswrapper[13205]: I0319 09:25:26.067889 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-79jrh" event={"ID":"745093e5-ffe1-4443-b317-448948f3b311","Type":"ContainerStarted","Data":"5cdc2f52ac7ce5d2553ae4f31ba647ca8f1149441be76dca799c49fc580f4722"} Mar 19 09:25:26.113064 master-0 kubenswrapper[13205]: I0319 09:25:26.112590 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/nbdb/0.log" Mar 19 09:25:26.131810 master-0 kubenswrapper[13205]: I0319 09:25:26.131758 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/sbdb/0.log" Mar 19 09:25:26.146919 master-0 kubenswrapper[13205]: I0319 09:25:26.146869 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vcxjs_d9eb3750-cb7b-4d3c-88bc-d1b68a370872/ovnkube-controller/0.log" Mar 19 09:25:26.216915 master-0 kubenswrapper[13205]: I0319 09:25:26.216769 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-b4bf74f6-wwd24_e71d0aae-5e50-45f9-a0f2-8f3c3357976a/check-endpoints/0.log" Mar 19 09:25:26.444549 master-0 kubenswrapper[13205]: I0319 09:25:26.426787 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4s5vc_10c609bb-136a-4ce2-b9e2-0a03e1a37a62/network-check-target-container/0.log" Mar 19 09:25:26.947347 master-0 kubenswrapper[13205]: I0319 09:25:26.947296 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/0.log" Mar 19 09:25:27.088071 master-0 kubenswrapper[13205]: I0319 09:25:27.087020 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/webhook/0.log" Mar 19 09:25:27.089034 master-0 kubenswrapper[13205]: I0319 09:25:27.088164 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4"] Mar 19 09:25:27.090016 master-0 kubenswrapper[13205]: I0319 09:25:27.089979 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.092273 master-0 kubenswrapper[13205]: I0319 09:25:27.092234 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 19 09:25:27.092411 master-0 kubenswrapper[13205]: I0319 09:25:27.092386 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-pvd7b" Mar 19 09:25:27.092804 master-0 kubenswrapper[13205]: I0319 09:25:27.092773 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 09:25:27.138955 master-0 kubenswrapper[13205]: I0319 09:25:27.136429 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46a87945-656e-4154-9235-644a90bffe83-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.138955 master-0 kubenswrapper[13205]: I0319 09:25:27.136554 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46a87945-656e-4154-9235-644a90bffe83-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.138955 master-0 kubenswrapper[13205]: I0319 09:25:27.136614 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/46a87945-656e-4154-9235-644a90bffe83-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.138955 master-0 kubenswrapper[13205]: I0319 09:25:27.136638 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4km5\" (UniqueName: \"kubernetes.io/projected/46a87945-656e-4154-9235-644a90bffe83-kube-api-access-w4km5\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.140432 master-0 kubenswrapper[13205]: I0319 09:25:27.140390 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pfzk7"] Mar 19 09:25:27.141413 master-0 kubenswrapper[13205]: I0319 09:25:27.141378 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4"] Mar 19 09:25:27.141500 master-0 kubenswrapper[13205]: I0319 09:25:27.141470 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.143719 master-0 kubenswrapper[13205]: I0319 09:25:27.143684 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-7dv6h" Mar 19 09:25:27.143822 master-0 kubenswrapper[13205]: I0319 09:25:27.143790 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 09:25:27.143922 master-0 kubenswrapper[13205]: I0319 09:25:27.143897 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 09:25:27.162937 master-0 kubenswrapper[13205]: I0319 09:25:27.162124 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl"] Mar 19 09:25:27.163441 master-0 kubenswrapper[13205]: I0319 09:25:27.163407 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.168703 master-0 kubenswrapper[13205]: I0319 09:25:27.168577 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-fmrxp" Mar 19 09:25:27.168823 master-0 kubenswrapper[13205]: I0319 09:25:27.168758 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 19 09:25:27.168862 master-0 kubenswrapper[13205]: I0319 09:25:27.168817 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 09:25:27.168862 master-0 kubenswrapper[13205]: I0319 09:25:27.168854 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 19 09:25:27.180508 master-0 kubenswrapper[13205]: I0319 09:25:27.180477 13205 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl"] Mar 19 09:25:27.238709 master-0 kubenswrapper[13205]: I0319 09:25:27.238637 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-textfile\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.238709 master-0 kubenswrapper[13205]: I0319 09:25:27.238724 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-wtmp\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.238758 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46a87945-656e-4154-9235-644a90bffe83-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.238801 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-root\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.238874 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.238895 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.238920 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.238960 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-tls\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.238982 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46a87945-656e-4154-9235-644a90bffe83-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " 
pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.239000 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4km5\" (UniqueName: \"kubernetes.io/projected/46a87945-656e-4154-9235-644a90bffe83-kube-api-access-w4km5\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.239103 master-0 kubenswrapper[13205]: I0319 09:25:27.239038 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.240342 master-0 kubenswrapper[13205]: I0319 09:25:27.240302 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46a87945-656e-4154-9235-644a90bffe83-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.244763 master-0 kubenswrapper[13205]: I0319 09:25:27.244221 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46a87945-656e-4154-9235-644a90bffe83-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.248184 master-0 kubenswrapper[13205]: I0319 09:25:27.239578 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fjvm\" (UniqueName: \"kubernetes.io/projected/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-api-access-6fjvm\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.248902 master-0 kubenswrapper[13205]: I0319 09:25:27.248848 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-sys\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.249081 master-0 kubenswrapper[13205]: I0319 09:25:27.249059 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4wpd\" (UniqueName: \"kubernetes.io/projected/86f98011-564c-4f08-8b8e-9d0518b77945-kube-api-access-t4wpd\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.249346 master-0 kubenswrapper[13205]: I0319 09:25:27.249323 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.249616 master-0 kubenswrapper[13205]: I0319 09:25:27.249512 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.249745 master-0 kubenswrapper[13205]: I0319 09:25:27.249677 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46a87945-656e-4154-9235-644a90bffe83-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.249809 master-0 kubenswrapper[13205]: I0319 09:25:27.249761 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86f98011-564c-4f08-8b8e-9d0518b77945-metrics-client-ca\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.261509 master-0 kubenswrapper[13205]: I0319 09:25:27.261456 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46a87945-656e-4154-9235-644a90bffe83-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.306190 master-0 kubenswrapper[13205]: I0319 09:25:27.306121 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/1.log" Mar 19 09:25:27.307403 master-0 kubenswrapper[13205]: I0319 09:25:27.307377 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4km5\" (UniqueName: \"kubernetes.io/projected/46a87945-656e-4154-9235-644a90bffe83-kube-api-access-w4km5\") pod \"openshift-state-metrics-5dc6c74576-6kvc4\" (UID: \"46a87945-656e-4154-9235-644a90bffe83\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.351398 master-0 kubenswrapper[13205]: I0319 09:25:27.351340 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.351607 master-0 kubenswrapper[13205]: I0319 09:25:27.351440 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.351607 master-0 kubenswrapper[13205]: I0319 09:25:27.351541 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86f98011-564c-4f08-8b8e-9d0518b77945-metrics-client-ca\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.351607 master-0 kubenswrapper[13205]: I0319 09:25:27.351570 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-textfile\") pod 
\"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.352816 master-0 kubenswrapper[13205]: I0319 09:25:27.352520 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86f98011-564c-4f08-8b8e-9d0518b77945-metrics-client-ca\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.352816 master-0 kubenswrapper[13205]: I0319 09:25:27.351592 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-wtmp\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.352816 master-0 kubenswrapper[13205]: I0319 09:25:27.352698 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-root\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.352816 master-0 kubenswrapper[13205]: I0319 09:25:27.352745 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.352816 master-0 kubenswrapper[13205]: I0319 09:25:27.352776 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.353087 master-0 kubenswrapper[13205]: I0319 09:25:27.352815 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.353087 master-0 kubenswrapper[13205]: I0319 09:25:27.352859 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.353087 master-0 kubenswrapper[13205]: I0319 09:25:27.352861 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-wtmp\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.353087 master-0 kubenswrapper[13205]: I0319 09:25:27.352876 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-tls\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.353087 master-0 kubenswrapper[13205]: I0319 09:25:27.352871 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-root\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.353869 master-0 kubenswrapper[13205]: I0319 09:25:27.353174 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.353869 master-0 kubenswrapper[13205]: I0319 09:25:27.353206 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fjvm\" (UniqueName: \"kubernetes.io/projected/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-api-access-6fjvm\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.353869 master-0 kubenswrapper[13205]: I0319 09:25:27.353221 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.353869 master-0 kubenswrapper[13205]: I0319 09:25:27.353264 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-sys\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " 
pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.353869 master-0 kubenswrapper[13205]: I0319 09:25:27.353292 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4wpd\" (UniqueName: \"kubernetes.io/projected/86f98011-564c-4f08-8b8e-9d0518b77945-kube-api-access-t4wpd\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.353869 master-0 kubenswrapper[13205]: I0319 09:25:27.353542 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86f98011-564c-4f08-8b8e-9d0518b77945-sys\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.354320 master-0 kubenswrapper[13205]: I0319 09:25:27.354205 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-textfile\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.355808 master-0 kubenswrapper[13205]: I0319 09:25:27.355784 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.356331 master-0 kubenswrapper[13205]: I0319 09:25:27.356301 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: 
\"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.356901 master-0 kubenswrapper[13205]: I0319 09:25:27.356843 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.357987 master-0 kubenswrapper[13205]: I0319 09:25:27.357963 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-tls\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.361119 master-0 kubenswrapper[13205]: I0319 09:25:27.361062 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86f98011-564c-4f08-8b8e-9d0518b77945-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.414299 master-0 kubenswrapper[13205]: I0319 09:25:27.414226 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" Mar 19 09:25:27.499908 master-0 kubenswrapper[13205]: I0319 09:25:27.499758 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qfc76_4f65184f-8fc2-4656-8776-a3b962aa1f5d/iptables-alerter/0.log" Mar 19 09:25:27.503126 master-0 kubenswrapper[13205]: I0319 09:25:27.503079 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4wpd\" (UniqueName: \"kubernetes.io/projected/86f98011-564c-4f08-8b8e-9d0518b77945-kube-api-access-t4wpd\") pod \"node-exporter-pfzk7\" (UID: \"86f98011-564c-4f08-8b8e-9d0518b77945\") " pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.739435 master-0 kubenswrapper[13205]: I0319 09:25:27.737581 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fjvm\" (UniqueName: \"kubernetes.io/projected/9427be32-8a99-4a07-aec9-5fe1ddcf1e2f-kube-api-access-6fjvm\") pod \"kube-state-metrics-7bbc969446-dtbjl\" (UID: \"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:27.745830 master-0 kubenswrapper[13205]: I0319 09:25:27.745778 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-b4d28_4abcf2ea-50f5-4d62-8a23-583438e5b451/network-operator/0.log" Mar 19 09:25:27.800596 master-0 kubenswrapper[13205]: I0319 09:25:27.800567 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pfzk7" Mar 19 09:25:27.814901 master-0 kubenswrapper[13205]: I0319 09:25:27.811843 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" Mar 19 09:25:28.383141 master-0 kubenswrapper[13205]: I0319 09:25:28.379852 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4"] Mar 19 09:25:28.383650 master-0 kubenswrapper[13205]: I0319 09:25:28.383248 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-b4d28_4abcf2ea-50f5-4d62-8a23-583438e5b451/network-operator/1.log" Mar 19 09:25:28.589991 master-0 kubenswrapper[13205]: I0319 09:25:28.589899 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-55s59_3b333a1e-2a7f-423a-8b40-99f30c89f740/openshift-apiserver-operator/0.log" Mar 19 09:25:28.820903 master-0 kubenswrapper[13205]: W0319 09:25:28.820794 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46a87945_656e_4154_9235_644a90bffe83.slice/crio-660ac90d514127a7cc99953204d7c96b4e7085b6524fc837d3bf10f15f07f2a8 WatchSource:0}: Error finding container 660ac90d514127a7cc99953204d7c96b4e7085b6524fc837d3bf10f15f07f2a8: Status 404 returned error can't find the container with id 660ac90d514127a7cc99953204d7c96b4e7085b6524fc837d3bf10f15f07f2a8 Mar 19 09:25:28.831682 master-0 kubenswrapper[13205]: I0319 09:25:28.831624 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-55s59_3b333a1e-2a7f-423a-8b40-99f30c89f740/openshift-apiserver-operator/1.log" Mar 19 09:25:28.984485 master-0 kubenswrapper[13205]: I0319 09:25:28.983073 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-7dcf67dd86-6hgld_64f60856-22dd-4560-acff-c620e17844a1/fix-audit-permissions/0.log" Mar 19 09:25:29.049402 master-0 kubenswrapper[13205]: I0319 
09:25:29.049369 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-7dcf67dd86-6hgld_64f60856-22dd-4560-acff-c620e17844a1/openshift-apiserver/0.log" Mar 19 09:25:29.085891 master-0 kubenswrapper[13205]: I0319 09:25:29.085785 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" event={"ID":"46a87945-656e-4154-9235-644a90bffe83","Type":"ContainerStarted","Data":"660ac90d514127a7cc99953204d7c96b4e7085b6524fc837d3bf10f15f07f2a8"} Mar 19 09:25:29.086548 master-0 kubenswrapper[13205]: I0319 09:25:29.086498 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pfzk7" event={"ID":"86f98011-564c-4f08-8b8e-9d0518b77945","Type":"ContainerStarted","Data":"6f10152eb3d7a1c0aed0689c3b8b9530212de525141c4d7bdc2762aae32272a8"} Mar 19 09:25:30.114948 master-0 kubenswrapper[13205]: I0319 09:25:30.114898 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" event={"ID":"46a87945-656e-4154-9235-644a90bffe83","Type":"ContainerStarted","Data":"9c227677792d4e05938668f6e0bc7a2a2be494dfb80cdab3b51c0132f1c8077a"} Mar 19 09:25:31.123031 master-0 kubenswrapper[13205]: I0319 09:25:31.122960 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" event={"ID":"46a87945-656e-4154-9235-644a90bffe83","Type":"ContainerStarted","Data":"7c64241388e9335735613ef85686c88b23b757b15d3fd57857e32b97808986c8"} Mar 19 09:25:32.002283 master-0 kubenswrapper[13205]: I0319 09:25:32.002218 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl"] Mar 19 09:25:32.013415 master-0 kubenswrapper[13205]: W0319 09:25:32.013375 13205 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9427be32_8a99_4a07_aec9_5fe1ddcf1e2f.slice/crio-367d34cba50e02aa128f1f7c44362b16ccac3a277f714b3eeadaa9c7d73a2c7c WatchSource:0}: Error finding container 367d34cba50e02aa128f1f7c44362b16ccac3a277f714b3eeadaa9c7d73a2c7c: Status 404 returned error can't find the container with id 367d34cba50e02aa128f1f7c44362b16ccac3a277f714b3eeadaa9c7d73a2c7c Mar 19 09:25:32.131211 master-0 kubenswrapper[13205]: I0319 09:25:32.131150 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" event={"ID":"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f","Type":"ContainerStarted","Data":"367d34cba50e02aa128f1f7c44362b16ccac3a277f714b3eeadaa9c7d73a2c7c"} Mar 19 09:25:33.489993 master-0 kubenswrapper[13205]: I0319 09:25:33.489870 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-7dcf67dd86-6hgld_64f60856-22dd-4560-acff-c620e17844a1/openshift-apiserver-check-endpoints/0.log" Mar 19 09:25:33.933332 master-0 kubenswrapper[13205]: I0319 09:25:33.933271 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-5bddk_a1098584-43b9-4f2c-83d2-22d95fb7b0c3/etcd-operator/0.log" Mar 19 09:25:33.959310 master-0 kubenswrapper[13205]: I0319 09:25:33.959179 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-5bddk_a1098584-43b9-4f2c-83d2-22d95fb7b0c3/etcd-operator/1.log" Mar 19 09:25:33.981337 master-0 kubenswrapper[13205]: I0319 09:25:33.981292 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-jg9m5_259794ab-d027-497a-b08e-5a6d79057668/catalog-operator/0.log" Mar 19 09:25:34.009478 master-0 kubenswrapper[13205]: I0319 09:25:34.008370 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-jg9m5_259794ab-d027-497a-b08e-5a6d79057668/catalog-operator/1.log" Mar 19 09:25:34.023376 master-0 kubenswrapper[13205]: I0319 09:25:34.023339 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-rh692_7b29cb7b-26d2-4fab-9e03-2d7fdf937592/olm-operator/0.log" Mar 19 09:25:34.051140 master-0 kubenswrapper[13205]: I0319 09:25:34.048686 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-rh692_7b29cb7b-26d2-4fab-9e03-2d7fdf937592/olm-operator/1.log" Mar 19 09:25:34.058636 master-0 kubenswrapper[13205]: I0319 09:25:34.058194 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-5jsnd_9a6c1523-e77c-4aac-814c-05d41215c42f/kube-rbac-proxy/0.log" Mar 19 09:25:34.075263 master-0 kubenswrapper[13205]: I0319 09:25:34.075218 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-5jsnd_9a6c1523-e77c-4aac-814c-05d41215c42f/package-server-manager/0.log" Mar 19 09:25:34.095638 master-0 kubenswrapper[13205]: I0319 09:25:34.095600 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-6f5bddd45b-hzcnw_588cf947-93a7-4e1d-b2fe-a281cb4eb44e/packageserver/0.log" Mar 19 09:25:34.161564 master-0 kubenswrapper[13205]: I0319 09:25:34.161215 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.163397 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.168168 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-252nv" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.168342 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.168460 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.168569 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.168698 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.168846 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.169634 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 09:25:34.170474 master-0 kubenswrapper[13205]: I0319 09:25:34.169728 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 09:25:34.175859 master-0 kubenswrapper[13205]: I0319 09:25:34.173086 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:25:34.177638 master-0 kubenswrapper[13205]: I0319 09:25:34.177600 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276158 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276212 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276244 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276271 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276301 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-web-config\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276324 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x49wh\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-kube-api-access-x49wh\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276350 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276549 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-out\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276647 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" 
Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276769 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276812 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-volume\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.290546 master-0 kubenswrapper[13205]: I0319 09:25:34.276827 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.389387 master-0 kubenswrapper[13205]: I0319 09:25:34.389322 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.389545 master-0 kubenswrapper[13205]: I0319 09:25:34.389496 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.389700 master-0 kubenswrapper[13205]: I0319 09:25:34.389604 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.389778 master-0 kubenswrapper[13205]: I0319 09:25:34.389678 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.389934 master-0 kubenswrapper[13205]: I0319 09:25:34.389868 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.389934 master-0 kubenswrapper[13205]: I0319 09:25:34.389897 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.390104 master-0 kubenswrapper[13205]: I0319 09:25:34.390030 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.390104 master-0 kubenswrapper[13205]: I0319 09:25:34.390080 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-web-config\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.390268 master-0 kubenswrapper[13205]: I0319 09:25:34.390214 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x49wh\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-kube-api-access-x49wh\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.390345 master-0 kubenswrapper[13205]: I0319 09:25:34.390319 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.390499 master-0 kubenswrapper[13205]: I0319 09:25:34.390450 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-out\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.390499 master-0 kubenswrapper[13205]: I0319 09:25:34.390477 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.392396 master-0 kubenswrapper[13205]: I0319 09:25:34.392310 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.392727 master-0 kubenswrapper[13205]: I0319 09:25:34.392644 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.392991 master-0 kubenswrapper[13205]: I0319 09:25:34.392941 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.403592 master-0 kubenswrapper[13205]: I0319 09:25:34.403514 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.406029 master-0 kubenswrapper[13205]: I0319 09:25:34.405983 13205 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-volume\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.406433 master-0 kubenswrapper[13205]: I0319 09:25:34.406388 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.407874 master-0 kubenswrapper[13205]: I0319 09:25:34.407745 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.409973 master-0 kubenswrapper[13205]: I0319 09:25:34.409954 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.410652 master-0 kubenswrapper[13205]: I0319 09:25:34.410622 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-out\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.411380 master-0 kubenswrapper[13205]: I0319 09:25:34.411346 
13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.411438 master-0 kubenswrapper[13205]: I0319 09:25:34.411403 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-web-config\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.425357 master-0 kubenswrapper[13205]: I0319 09:25:34.425229 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x49wh\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-kube-api-access-x49wh\") pod \"alertmanager-main-0\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:34.498776 master-0 kubenswrapper[13205]: I0319 09:25:34.498562 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:25:35.138576 master-0 kubenswrapper[13205]: I0319 09:25:35.137021 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:25:35.167872 master-0 kubenswrapper[13205]: I0319 09:25:35.167159 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-79jrh" event={"ID":"745093e5-ffe1-4443-b317-448948f3b311","Type":"ContainerStarted","Data":"a2a8428e87d8ba21512d409f07b64c82ad97b0b19c47182c0292e69f16d22050"} Mar 19 09:25:35.167872 master-0 kubenswrapper[13205]: I0319 09:25:35.167201 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-79jrh" event={"ID":"745093e5-ffe1-4443-b317-448948f3b311","Type":"ContainerStarted","Data":"3564ecb707df62299295c484ddb91ff129b5cbdc44914c5687d4294c481f76c4"} Mar 19 09:25:35.167872 master-0 kubenswrapper[13205]: I0319 09:25:35.167350 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-79jrh" Mar 19 09:25:35.219590 master-0 kubenswrapper[13205]: E0319 09:25:35.216697 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:25:35.256891 master-0 kubenswrapper[13205]: W0319 09:25:35.256830 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d9f4bd_97d8_42be_b5a7_0c8cbf45350b.slice/crio-e5b3016c33ac078d83ccd3c0c97f494889360e9c7848774d638179ce5e816894 WatchSource:0}: Error finding container 
e5b3016c33ac078d83ccd3c0c97f494889360e9c7848774d638179ce5e816894: Status 404 returned error can't find the container with id e5b3016c33ac078d83ccd3c0c97f494889360e9c7848774d638179ce5e816894 Mar 19 09:25:35.337187 master-0 kubenswrapper[13205]: I0319 09:25:35.337109 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-79jrh" podStartSLOduration=67.183042227 podStartE2EDuration="1m15.33709001s" podCreationTimestamp="2026-03-19 09:24:20 +0000 UTC" firstStartedPulling="2026-03-19 09:25:25.787284073 +0000 UTC m=+111.119590961" lastFinishedPulling="2026-03-19 09:25:33.941331856 +0000 UTC m=+119.273638744" observedRunningTime="2026-03-19 09:25:35.31658073 +0000 UTC m=+120.648887628" watchObservedRunningTime="2026-03-19 09:25:35.33709001 +0000 UTC m=+120.669396898" Mar 19 09:25:35.340126 master-0 kubenswrapper[13205]: I0319 09:25:35.340028 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6785466799-nphk8"] Mar 19 09:25:35.340976 master-0 kubenswrapper[13205]: I0319 09:25:35.340945 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" Mar 19 09:25:35.345384 master-0 kubenswrapper[13205]: I0319 09:25:35.345339 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-fdmh2" Mar 19 09:25:35.345562 master-0 kubenswrapper[13205]: I0319 09:25:35.345406 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 19 09:25:35.359680 master-0 kubenswrapper[13205]: I0319 09:25:35.359633 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6785466799-nphk8"] Mar 19 09:25:35.519294 master-0 kubenswrapper[13205]: I0319 09:25:35.519225 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2d6d2016-5c9a-4772-b247-255563ba9fad-monitoring-plugin-cert\") pod \"monitoring-plugin-6785466799-nphk8\" (UID: \"2d6d2016-5c9a-4772-b247-255563ba9fad\") " pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" Mar 19 09:25:35.620991 master-0 kubenswrapper[13205]: I0319 09:25:35.620936 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2d6d2016-5c9a-4772-b247-255563ba9fad-monitoring-plugin-cert\") pod \"monitoring-plugin-6785466799-nphk8\" (UID: \"2d6d2016-5c9a-4772-b247-255563ba9fad\") " pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" Mar 19 09:25:35.641815 master-0 kubenswrapper[13205]: I0319 09:25:35.641753 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2d6d2016-5c9a-4772-b247-255563ba9fad-monitoring-plugin-cert\") pod \"monitoring-plugin-6785466799-nphk8\" (UID: \"2d6d2016-5c9a-4772-b247-255563ba9fad\") " pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" Mar 
19 09:25:35.686454 master-0 kubenswrapper[13205]: I0319 09:25:35.686356 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" Mar 19 09:25:36.173763 master-0 kubenswrapper[13205]: I0319 09:25:36.173697 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerStarted","Data":"e5b3016c33ac078d83ccd3c0c97f494889360e9c7848774d638179ce5e816894"} Mar 19 09:25:36.884263 master-0 kubenswrapper[13205]: I0319 09:25:36.883270 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-594b9755d9-zlw92"] Mar 19 09:25:36.890888 master-0 kubenswrapper[13205]: I0319 09:25:36.890842 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:36.895747 master-0 kubenswrapper[13205]: I0319 09:25:36.893203 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 09:25:36.895747 master-0 kubenswrapper[13205]: I0319 09:25:36.893346 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 09:25:36.895747 master-0 kubenswrapper[13205]: I0319 09:25:36.893895 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 09:25:36.895747 master-0 kubenswrapper[13205]: I0319 09:25:36.893924 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-jvtc2" Mar 19 09:25:36.895747 master-0 kubenswrapper[13205]: I0319 09:25:36.893957 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6j7vofh1gbciq" Mar 19 09:25:36.895747 master-0 kubenswrapper[13205]: 
I0319 09:25:36.894042 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 19 09:25:36.897710 master-0 kubenswrapper[13205]: I0319 09:25:36.897290 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 09:25:37.019638 master-0 kubenswrapper[13205]: I0319 09:25:37.019590 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-594b9755d9-zlw92"] Mar 19 09:25:37.042628 master-0 kubenswrapper[13205]: I0319 09:25:37.042563 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.042628 master-0 kubenswrapper[13205]: I0319 09:25:37.042623 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.042927 master-0 kubenswrapper[13205]: I0319 09:25:37.042764 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " 
pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.043063 master-0 kubenswrapper[13205]: I0319 09:25:37.043033 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-metrics-client-ca\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.043155 master-0 kubenswrapper[13205]: I0319 09:25:37.043066 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k299h\" (UniqueName: \"kubernetes.io/projected/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-kube-api-access-k299h\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.043239 master-0 kubenswrapper[13205]: I0319 09:25:37.043212 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-tls\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.043388 master-0 kubenswrapper[13205]: I0319 09:25:37.043346 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.043544 master-0 kubenswrapper[13205]: I0319 09:25:37.043486 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-grpc-tls\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.145189 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.145247 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.145969 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-metrics-client-ca\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.146021 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k299h\" (UniqueName: 
\"kubernetes.io/projected/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-kube-api-access-k299h\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.146080 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-tls\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.146109 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.146198 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-grpc-tls\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.147492 master-0 kubenswrapper[13205]: I0319 09:25:37.146256 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " 
pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.149686 master-0 kubenswrapper[13205]: I0319 09:25:37.149653 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-metrics-client-ca\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.151172 master-0 kubenswrapper[13205]: I0319 09:25:37.151144 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.151550 master-0 kubenswrapper[13205]: I0319 09:25:37.151490 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-tls\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.151799 master-0 kubenswrapper[13205]: I0319 09:25:37.151768 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.151948 master-0 kubenswrapper[13205]: I0319 09:25:37.151918 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.152297 master-0 kubenswrapper[13205]: I0319 09:25:37.152220 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.153207 master-0 kubenswrapper[13205]: I0319 09:25:37.153170 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-secret-grpc-tls\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.382733 master-0 kubenswrapper[13205]: I0319 09:25:37.382644 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k299h\" (UniqueName: \"kubernetes.io/projected/16a30c06-47fd-44c5-8a5f-91374c9fbcdc-kube-api-access-k299h\") pod \"thanos-querier-594b9755d9-zlw92\" (UID: \"16a30c06-47fd-44c5-8a5f-91374c9fbcdc\") " pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:37.518555 master-0 kubenswrapper[13205]: I0319 09:25:37.518492 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" Mar 19 09:25:39.193449 master-0 kubenswrapper[13205]: I0319 09:25:39.193262 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" event={"ID":"46a87945-656e-4154-9235-644a90bffe83","Type":"ContainerStarted","Data":"6b1f88919ccb298b51fbf6dc022c01fca98bb5e7502b772692dbfff7f1c1d767"} Mar 19 09:25:40.200765 master-0 kubenswrapper[13205]: I0319 09:25:40.200639 13205 generic.go:334] "Generic (PLEG): container finished" podID="86f98011-564c-4f08-8b8e-9d0518b77945" containerID="75b71882b1f2366bf105ceab243e734efa6e0608ce83de8155f62afe7186b9be" exitCode=0 Mar 19 09:25:40.200765 master-0 kubenswrapper[13205]: I0319 09:25:40.200751 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pfzk7" event={"ID":"86f98011-564c-4f08-8b8e-9d0518b77945","Type":"ContainerDied","Data":"75b71882b1f2366bf105ceab243e734efa6e0608ce83de8155f62afe7186b9be"} Mar 19 09:25:40.202449 master-0 kubenswrapper[13205]: I0319 09:25:40.202415 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" event={"ID":"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f","Type":"ContainerStarted","Data":"98601ac42ea8a4fc7a34afb3c33a2f14d1e4e6cd8d0384d45c3a6b96a496b288"} Mar 19 09:25:41.216039 master-0 kubenswrapper[13205]: I0319 09:25:41.215239 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" event={"ID":"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f","Type":"ContainerStarted","Data":"915183ea8f29cbd8bd727ec4f8af04c9e58feb0f64207ecc89980f09470c6c0a"} Mar 19 09:25:41.867586 master-0 kubenswrapper[13205]: I0319 09:25:41.861742 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6785466799-nphk8"] Mar 19 09:25:41.890700 master-0 kubenswrapper[13205]: I0319 09:25:41.890635 
13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-594b9755d9-zlw92"] Mar 19 09:25:41.900857 master-0 kubenswrapper[13205]: I0319 09:25:41.892895 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-6kvc4" podStartSLOduration=9.105181472 podStartE2EDuration="15.892870483s" podCreationTimestamp="2026-03-19 09:25:26 +0000 UTC" firstStartedPulling="2026-03-19 09:25:31.153493028 +0000 UTC m=+116.485799916" lastFinishedPulling="2026-03-19 09:25:37.941182039 +0000 UTC m=+123.273488927" observedRunningTime="2026-03-19 09:25:41.882603726 +0000 UTC m=+127.214910614" watchObservedRunningTime="2026-03-19 09:25:41.892870483 +0000 UTC m=+127.225177381" Mar 19 09:25:42.252150 master-0 kubenswrapper[13205]: W0319 09:25:42.252050 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d6d2016_5c9a_4772_b247_255563ba9fad.slice/crio-27992dfd9b3ef7634533dec838cdefba65dc15535ad849206d8f6ceb94cabe88 WatchSource:0}: Error finding container 27992dfd9b3ef7634533dec838cdefba65dc15535ad849206d8f6ceb94cabe88: Status 404 returned error can't find the container with id 27992dfd9b3ef7634533dec838cdefba65dc15535ad849206d8f6ceb94cabe88 Mar 19 09:25:42.328902 master-0 kubenswrapper[13205]: I0319 09:25:42.328857 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-fd9c84ccc-jlrzf"] Mar 19 09:25:42.329825 master-0 kubenswrapper[13205]: I0319 09:25:42.329800 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.342113 master-0 kubenswrapper[13205]: I0319 09:25:42.336804 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-avpd2mlhiq4t" Mar 19 09:25:42.346287 master-0 kubenswrapper[13205]: I0319 09:25:42.346209 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fd9c84ccc-jlrzf"] Mar 19 09:25:42.357577 master-0 kubenswrapper[13205]: I0319 09:25:42.346612 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 09:25:42.357577 master-0 kubenswrapper[13205]: I0319 09:25:42.353516 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 09:25:42.357577 master-0 kubenswrapper[13205]: I0319 09:25:42.353740 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 09:25:42.357577 master-0 kubenswrapper[13205]: I0319 09:25:42.354074 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 09:25:42.357577 master-0 kubenswrapper[13205]: I0319 09:25:42.354189 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2rqbc" Mar 19 09:25:42.364391 master-0 kubenswrapper[13205]: I0319 09:25:42.364197 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-client-ca-bundle\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.364478 master-0 kubenswrapper[13205]: I0319 09:25:42.364423 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a8dce8-1815-4676-ab46-2cce5bc21bfd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.364478 master-0 kubenswrapper[13205]: I0319 09:25:42.364471 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-secret-metrics-client-certs\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.364587 master-0 kubenswrapper[13205]: I0319 09:25:42.364556 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-secret-metrics-server-tls\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.364623 master-0 kubenswrapper[13205]: I0319 09:25:42.364587 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtnjb\" (UniqueName: \"kubernetes.io/projected/36a8dce8-1815-4676-ab46-2cce5bc21bfd-kube-api-access-jtnjb\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.364654 master-0 kubenswrapper[13205]: I0319 09:25:42.364629 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/36a8dce8-1815-4676-ab46-2cce5bc21bfd-metrics-server-audit-profiles\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.364717 master-0 kubenswrapper[13205]: I0319 09:25:42.364694 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/36a8dce8-1815-4676-ab46-2cce5bc21bfd-audit-log\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.466685 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-secret-metrics-server-tls\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.466744 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtnjb\" (UniqueName: \"kubernetes.io/projected/36a8dce8-1815-4676-ab46-2cce5bc21bfd-kube-api-access-jtnjb\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.466782 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/36a8dce8-1815-4676-ab46-2cce5bc21bfd-metrics-server-audit-profiles\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 
09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.466823 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/36a8dce8-1815-4676-ab46-2cce5bc21bfd-audit-log\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.466875 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-client-ca-bundle\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.466910 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a8dce8-1815-4676-ab46-2cce5bc21bfd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.466941 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-secret-metrics-client-certs\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.469878 master-0 kubenswrapper[13205]: I0319 09:25:42.469006 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/36a8dce8-1815-4676-ab46-2cce5bc21bfd-metrics-server-audit-profiles\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.474556 master-0 kubenswrapper[13205]: I0319 09:25:42.473033 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a8dce8-1815-4676-ab46-2cce5bc21bfd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.474556 master-0 kubenswrapper[13205]: I0319 09:25:42.473297 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/36a8dce8-1815-4676-ab46-2cce5bc21bfd-audit-log\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.480558 master-0 kubenswrapper[13205]: I0319 09:25:42.478485 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-secret-metrics-client-certs\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.480558 master-0 kubenswrapper[13205]: I0319 09:25:42.478727 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-secret-metrics-server-tls\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.506755 master-0 
kubenswrapper[13205]: I0319 09:25:42.505966 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8dce8-1815-4676-ab46-2cce5bc21bfd-client-ca-bundle\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.520596 master-0 kubenswrapper[13205]: I0319 09:25:42.520170 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtnjb\" (UniqueName: \"kubernetes.io/projected/36a8dce8-1815-4676-ab46-2cce5bc21bfd-kube-api-access-jtnjb\") pod \"metrics-server-fd9c84ccc-jlrzf\" (UID: \"36a8dce8-1815-4676-ab46-2cce5bc21bfd\") " pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.683756 master-0 kubenswrapper[13205]: I0319 09:25:42.681252 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" Mar 19 09:25:42.752447 master-0 kubenswrapper[13205]: I0319 09:25:42.752298 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_155018d1-af14-4adc-b7a0-cab0133dd65f/installer/0.log" Mar 19 09:25:42.752447 master-0 kubenswrapper[13205]: I0319 09:25:42.752411 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:42.769975 master-0 kubenswrapper[13205]: I0319 09:25:42.769898 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-var-lock\") pod \"155018d1-af14-4adc-b7a0-cab0133dd65f\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " Mar 19 09:25:42.770075 master-0 kubenswrapper[13205]: I0319 09:25:42.770039 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155018d1-af14-4adc-b7a0-cab0133dd65f-kube-api-access\") pod \"155018d1-af14-4adc-b7a0-cab0133dd65f\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") " Mar 19 09:25:42.770075 master-0 kubenswrapper[13205]: I0319 09:25:42.770039 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-var-lock" (OuterVolumeSpecName: "var-lock") pod "155018d1-af14-4adc-b7a0-cab0133dd65f" (UID: "155018d1-af14-4adc-b7a0-cab0133dd65f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:42.770134 master-0 kubenswrapper[13205]: I0319 09:25:42.770062 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-kubelet-dir\") pod \"155018d1-af14-4adc-b7a0-cab0133dd65f\" (UID: \"155018d1-af14-4adc-b7a0-cab0133dd65f\") "
Mar 19 09:25:42.770731 master-0 kubenswrapper[13205]: I0319 09:25:42.770694 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:42.770731 master-0 kubenswrapper[13205]: I0319 09:25:42.770092 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "155018d1-af14-4adc-b7a0-cab0133dd65f" (UID: "155018d1-af14-4adc-b7a0-cab0133dd65f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:42.788309 master-0 kubenswrapper[13205]: I0319 09:25:42.787991 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155018d1-af14-4adc-b7a0-cab0133dd65f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "155018d1-af14-4adc-b7a0-cab0133dd65f" (UID: "155018d1-af14-4adc-b7a0-cab0133dd65f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:25:42.876996 master-0 kubenswrapper[13205]: I0319 09:25:42.872800 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155018d1-af14-4adc-b7a0-cab0133dd65f-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:42.876996 master-0 kubenswrapper[13205]: I0319 09:25:42.872837 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155018d1-af14-4adc-b7a0-cab0133dd65f-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:43.123886 master-0 kubenswrapper[13205]: I0319 09:25:43.123838 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fd9c84ccc-jlrzf"]
Mar 19 09:25:43.129730 master-0 kubenswrapper[13205]: W0319 09:25:43.127734 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a8dce8_1815_4676_ab46_2cce5bc21bfd.slice/crio-b24f0b8227453849d0343dc7f0fcd24c2765298e3be580b90d1d61e27b9cf8c2 WatchSource:0}: Error finding container b24f0b8227453849d0343dc7f0fcd24c2765298e3be580b90d1d61e27b9cf8c2: Status 404 returned error can't find the container with id b24f0b8227453849d0343dc7f0fcd24c2765298e3be580b90d1d61e27b9cf8c2
Mar 19 09:25:43.227868 master-0 kubenswrapper[13205]: I0319 09:25:43.227774 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" event={"ID":"2d6d2016-5c9a-4772-b247-255563ba9fad","Type":"ContainerStarted","Data":"27992dfd9b3ef7634533dec838cdefba65dc15535ad849206d8f6ceb94cabe88"}
Mar 19 09:25:43.229654 master-0 kubenswrapper[13205]: I0319 09:25:43.229612 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" event={"ID":"36a8dce8-1815-4676-ab46-2cce5bc21bfd","Type":"ContainerStarted","Data":"b24f0b8227453849d0343dc7f0fcd24c2765298e3be580b90d1d61e27b9cf8c2"}
Mar 19 09:25:43.232348 master-0 kubenswrapper[13205]: I0319 09:25:43.232282 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" event={"ID":"9427be32-8a99-4a07-aec9-5fe1ddcf1e2f","Type":"ContainerStarted","Data":"0fbc2e8094e4c8fe973a7f5c5eef1876aa9fd5e89f220e7c6dc20d118b0502ca"}
Mar 19 09:25:43.234462 master-0 kubenswrapper[13205]: I0319 09:25:43.234423 13205 generic.go:334] "Generic (PLEG): container finished" podID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerID="90bdbbe49c2c62e2be6bd1d4d57ced9f777ee8aff32c998b313fe88cbe77c54e" exitCode=0
Mar 19 09:25:43.234753 master-0 kubenswrapper[13205]: I0319 09:25:43.234729 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"90bdbbe49c2c62e2be6bd1d4d57ced9f777ee8aff32c998b313fe88cbe77c54e"}
Mar 19 09:25:43.235640 master-0 kubenswrapper[13205]: I0319 09:25:43.235609 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" event={"ID":"16a30c06-47fd-44c5-8a5f-91374c9fbcdc","Type":"ContainerStarted","Data":"cf07f672f392fee92e293ccd6f02cb21801963d8fad97a1ae00e83d058d4287c"}
Mar 19 09:25:43.245617 master-0 kubenswrapper[13205]: I0319 09:25:43.245589 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_155018d1-af14-4adc-b7a0-cab0133dd65f/installer/0.log"
Mar 19 09:25:43.245818 master-0 kubenswrapper[13205]: I0319 09:25:43.245634 13205 generic.go:334] "Generic (PLEG): container finished" podID="155018d1-af14-4adc-b7a0-cab0133dd65f" containerID="28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c" exitCode=1
Mar 19 09:25:43.245818 master-0 kubenswrapper[13205]: I0319 09:25:43.245693 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"155018d1-af14-4adc-b7a0-cab0133dd65f","Type":"ContainerDied","Data":"28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c"}
Mar 19 09:25:43.245818 master-0 kubenswrapper[13205]: I0319 09:25:43.245731 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 09:25:43.245818 master-0 kubenswrapper[13205]: I0319 09:25:43.245753 13205 scope.go:117] "RemoveContainer" containerID="28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c"
Mar 19 09:25:43.246482 master-0 kubenswrapper[13205]: I0319 09:25:43.245738 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"155018d1-af14-4adc-b7a0-cab0133dd65f","Type":"ContainerDied","Data":"15511e8ed5c9afc7170df0ca1831b83c9ff949924ce0e3b4e77f5ca91a7e9907"}
Mar 19 09:25:43.248261 master-0 kubenswrapper[13205]: I0319 09:25:43.248219 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pfzk7" event={"ID":"86f98011-564c-4f08-8b8e-9d0518b77945","Type":"ContainerStarted","Data":"b6b235807feea7bba22b4866e79a1704e6685ac9a74dd8aa238866317dc8cf1b"}
Mar 19 09:25:43.248261 master-0 kubenswrapper[13205]: I0319 09:25:43.248254 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pfzk7" event={"ID":"86f98011-564c-4f08-8b8e-9d0518b77945","Type":"ContainerStarted","Data":"92507fe53b22b14763b1be26b7884441b07e88f9d87f53b2a2b3797454f083dc"}
Mar 19 09:25:43.255947 master-0 kubenswrapper[13205]: I0319 09:25:43.255480 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dtbjl" podStartSLOduration=10.311004274 podStartE2EDuration="16.255460431s" podCreationTimestamp="2026-03-19 09:25:27 +0000 UTC" firstStartedPulling="2026-03-19 09:25:32.01870565 +0000 UTC m=+117.351012538" lastFinishedPulling="2026-03-19 09:25:37.963161807 +0000 UTC m=+123.295468695" observedRunningTime="2026-03-19 09:25:43.255052571 +0000 UTC m=+128.587359459" watchObservedRunningTime="2026-03-19 09:25:43.255460431 +0000 UTC m=+128.587767319"
Mar 19 09:25:43.289479 master-0 kubenswrapper[13205]: I0319 09:25:43.289448 13205 scope.go:117] "RemoveContainer" containerID="28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c"
Mar 19 09:25:43.295203 master-0 kubenswrapper[13205]: E0319 09:25:43.294943 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c\": container with ID starting with 28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c not found: ID does not exist" containerID="28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c"
Mar 19 09:25:43.295203 master-0 kubenswrapper[13205]: I0319 09:25:43.295013 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c"} err="failed to get container status \"28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c\": rpc error: code = NotFound desc = could not find container \"28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c\": container with ID starting with 28375fc40d90c4d4c8756b6c59e7e8b134d86a4d74beebff391f1b54b29ea47c not found: ID does not exist"
Mar 19 09:25:43.297622 master-0 kubenswrapper[13205]: I0319 09:25:43.297012 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pfzk7" podStartSLOduration=8.226342039 podStartE2EDuration="17.296998796s" podCreationTimestamp="2026-03-19 09:25:26 +0000 UTC" firstStartedPulling="2026-03-19 09:25:28.869716972 +0000 UTC m=+114.202023860" lastFinishedPulling="2026-03-19 09:25:37.940373729 +0000 UTC m=+123.272680617" observedRunningTime="2026-03-19 09:25:43.287192862 +0000 UTC m=+128.619499750" watchObservedRunningTime="2026-03-19 09:25:43.296998796 +0000 UTC m=+128.629305684"
Mar 19 09:25:43.385645 master-0 kubenswrapper[13205]: I0319 09:25:43.377755 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 09:25:43.388238 master-0 kubenswrapper[13205]: I0319 09:25:43.388188 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 09:25:44.258228 master-0 kubenswrapper[13205]: I0319 09:25:44.258106 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" event={"ID":"2d6d2016-5c9a-4772-b247-255563ba9fad","Type":"ContainerStarted","Data":"8079714eb69fb2c4d9ee8ae82ae34dc5ef217b243a469b91abf0fa02bed6832e"}
Mar 19 09:25:44.260672 master-0 kubenswrapper[13205]: I0319 09:25:44.258433 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8"
Mar 19 09:25:44.269762 master-0 kubenswrapper[13205]: I0319 09:25:44.268894 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8"
Mar 19 09:25:44.276594 master-0 kubenswrapper[13205]: I0319 09:25:44.276215 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6785466799-nphk8" podStartSLOduration=7.565630869 podStartE2EDuration="9.276194789s" podCreationTimestamp="2026-03-19 09:25:35 +0000 UTC" firstStartedPulling="2026-03-19 09:25:42.263584951 +0000 UTC m=+127.595891849" lastFinishedPulling="2026-03-19 09:25:43.974148881 +0000 UTC m=+129.306455769" observedRunningTime="2026-03-19 09:25:44.273664776 +0000 UTC m=+129.605971674" watchObservedRunningTime="2026-03-19 09:25:44.276194789 +0000 UTC m=+129.608501677"
Mar 19 09:25:44.861559 master-0 kubenswrapper[13205]: I0319 09:25:44.861492 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="155018d1-af14-4adc-b7a0-cab0133dd65f" path="/var/lib/kubelet/pods/155018d1-af14-4adc-b7a0-cab0133dd65f/volumes"
Mar 19 09:25:46.249675 master-0 kubenswrapper[13205]: I0319 09:25:46.249628 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-79jrh"
Mar 19 09:25:47.299210 master-0 kubenswrapper[13205]: I0319 09:25:47.299121 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" event={"ID":"16a30c06-47fd-44c5-8a5f-91374c9fbcdc","Type":"ContainerStarted","Data":"c74138ecd73e64e44fe47d146651d399d763674c2bddf8265a03668f1caf4247"}
Mar 19 09:25:47.299210 master-0 kubenswrapper[13205]: I0319 09:25:47.299186 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" event={"ID":"16a30c06-47fd-44c5-8a5f-91374c9fbcdc","Type":"ContainerStarted","Data":"17fd4c3f5979f3355bba43acf3a2c7e00d821e0799f4ee793eb609c920e36f56"}
Mar 19 09:25:47.299210 master-0 kubenswrapper[13205]: I0319 09:25:47.299199 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" event={"ID":"16a30c06-47fd-44c5-8a5f-91374c9fbcdc","Type":"ContainerStarted","Data":"26cb5dd7bdf16e05b0ca48bad7884480197e631b6ac2ba18b059f2e81a4e9bd0"}
Mar 19 09:25:47.301338 master-0 kubenswrapper[13205]: I0319 09:25:47.301279 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" event={"ID":"36a8dce8-1815-4676-ab46-2cce5bc21bfd","Type":"ContainerStarted","Data":"f0eafa49f85f1b2c4d7641a63cf1c68c7303a2a8294c1d93702434bf0081f3f8"}
Mar 19 09:25:47.313396 master-0 kubenswrapper[13205]: I0319 09:25:47.313094 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerStarted","Data":"b58230ea6728c09693b47e9b3c47f1cdfd5bb13858f7d22418d0a0bdedc1cc40"}
Mar 19 09:25:47.313396 master-0 kubenswrapper[13205]: I0319 09:25:47.313378 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerStarted","Data":"f01fafa23647d2c15c12e9ac89a35b57dc0ffe34180d8863d99d69493d129fe2"}
Mar 19 09:25:47.313775 master-0 kubenswrapper[13205]: I0319 09:25:47.313744 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerStarted","Data":"3aa897760287be7bf5678ba84bdc3a14c8994b11d15cd84e87e2e74bc308cfec"}
Mar 19 09:25:47.313775 master-0 kubenswrapper[13205]: I0319 09:25:47.313766 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerStarted","Data":"4e02fedc03344827c1ab9002c8d47589bccbb85c521ddda8cf2e2351989807a5"}
Mar 19 09:25:47.313775 master-0 kubenswrapper[13205]: I0319 09:25:47.313775 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerStarted","Data":"17884e5909f5f6cdc750bcb0af814ac2c5ddff0fda4e8d2db5915fe8b6602930"}
Mar 19 09:25:47.323910 master-0 kubenswrapper[13205]: I0319 09:25:47.323844 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf" podStartSLOduration=2.183987802 podStartE2EDuration="5.323832752s" podCreationTimestamp="2026-03-19 09:25:42 +0000 UTC" firstStartedPulling="2026-03-19 09:25:43.130561428 +0000 UTC m=+128.462868316" lastFinishedPulling="2026-03-19 09:25:46.270406378 +0000 UTC m=+131.602713266" observedRunningTime="2026-03-19 09:25:47.322246972 +0000 UTC m=+132.654553860" watchObservedRunningTime="2026-03-19 09:25:47.323832752 +0000 UTC m=+132.656139640"
Mar 19 09:25:49.328920 master-0 kubenswrapper[13205]: I0319 09:25:49.328857 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" event={"ID":"16a30c06-47fd-44c5-8a5f-91374c9fbcdc","Type":"ContainerStarted","Data":"c6b3832e0aa8c5144e33f02f6dee4714e86db71ce70bbd70e1868188c29b5178"}
Mar 19 09:25:49.328920 master-0 kubenswrapper[13205]: I0319 09:25:49.328908 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" event={"ID":"16a30c06-47fd-44c5-8a5f-91374c9fbcdc","Type":"ContainerStarted","Data":"ad9514a81da96370caf1137f239ffc90a862557ce7faeaf9b169b9165cc84b2e"}
Mar 19 09:25:49.328920 master-0 kubenswrapper[13205]: I0319 09:25:49.328922 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" event={"ID":"16a30c06-47fd-44c5-8a5f-91374c9fbcdc","Type":"ContainerStarted","Data":"d9f12a4876d0a21a7f8dbd3ee23f0b90a4ede335b3d1cbaaf76f5812f382c95e"}
Mar 19 09:25:49.330161 master-0 kubenswrapper[13205]: I0319 09:25:49.330059 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92"
Mar 19 09:25:49.336822 master-0 kubenswrapper[13205]: I0319 09:25:49.336765 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerStarted","Data":"0c056e4adcec340ff25f4bf0e81c1d601d8672e70e9ed3ad93cb6aaf58259ee8"}
Mar 19 09:25:49.358800 master-0 kubenswrapper[13205]: I0319 09:25:49.358709 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92" podStartSLOduration=7.526417833 podStartE2EDuration="13.358689134s" podCreationTimestamp="2026-03-19 09:25:36 +0000 UTC" firstStartedPulling="2026-03-19 09:25:42.262140265 +0000 UTC m=+127.594447163" lastFinishedPulling="2026-03-19 09:25:48.094411576 +0000 UTC m=+133.426718464" observedRunningTime="2026-03-19 09:25:49.356011067 +0000 UTC m=+134.688317965" watchObservedRunningTime="2026-03-19 09:25:49.358689134 +0000 UTC m=+134.690996012"
Mar 19 09:25:49.418624 master-0 kubenswrapper[13205]: I0319 09:25:49.414074 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.769841745 podStartE2EDuration="15.414055923s" podCreationTimestamp="2026-03-19 09:25:34 +0000 UTC" firstStartedPulling="2026-03-19 09:25:35.25962171 +0000 UTC m=+120.591928588" lastFinishedPulling="2026-03-19 09:25:48.903835878 +0000 UTC m=+134.236142766" observedRunningTime="2026-03-19 09:25:49.409058249 +0000 UTC m=+134.741365137" watchObservedRunningTime="2026-03-19 09:25:49.414055923 +0000 UTC m=+134.746362811"
Mar 19 09:25:51.376251 master-0 kubenswrapper[13205]: I0319 09:25:51.376196 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-594b9755d9-zlw92"
Mar 19 09:26:02.682840 master-0 kubenswrapper[13205]: I0319 09:26:02.682709 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf"
Mar 19 09:26:02.682840 master-0 kubenswrapper[13205]: I0319 09:26:02.682773 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf"
Mar 19 09:26:06.345709 master-0 kubenswrapper[13205]: I0319 09:26:06.345642 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-2k849"]
Mar 19 09:26:06.346512 master-0 kubenswrapper[13205]: E0319 09:26:06.345898 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155018d1-af14-4adc-b7a0-cab0133dd65f" containerName="installer"
Mar 19 09:26:06.346512 master-0 kubenswrapper[13205]: I0319 09:26:06.345910 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="155018d1-af14-4adc-b7a0-cab0133dd65f" containerName="installer"
Mar 19 09:26:06.346512 master-0 kubenswrapper[13205]: I0319 09:26:06.346095 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="155018d1-af14-4adc-b7a0-cab0133dd65f" containerName="installer"
Mar 19 09:26:06.346762 master-0 kubenswrapper[13205]: I0319 09:26:06.346540 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:06.348401 master-0 kubenswrapper[13205]: I0319 09:26:06.348362 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 19 09:26:06.348401 master-0 kubenswrapper[13205]: I0319 09:26:06.348363 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-4wm5n"
Mar 19 09:26:06.349779 master-0 kubenswrapper[13205]: I0319 09:26:06.349743 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 09:26:06.367135 master-0 kubenswrapper[13205]: I0319 09:26:06.367089 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-2k849"]
Mar 19 09:26:06.483554 master-0 kubenswrapper[13205]: I0319 09:26:06.483269 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc049aa3-0344-4031-a38d-05ad0ac0cf4f-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-2k849\" (UID: \"cc049aa3-0344-4031-a38d-05ad0ac0cf4f\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:06.483554 master-0 kubenswrapper[13205]: I0319 09:26:06.483381 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc049aa3-0344-4031-a38d-05ad0ac0cf4f-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-2k849\" (UID: \"cc049aa3-0344-4031-a38d-05ad0ac0cf4f\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:06.493272 master-0 kubenswrapper[13205]: I0319 09:26:06.493071 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-hfq5p" event={"ID":"437ab63c-8bc0-4761-81fd-0da0052a9628","Type":"ContainerStarted","Data":"e575f4155c1175444fdbfdc44d9448c89da65cc1034dbb5fffe34aeafb98ec03"}
Mar 19 09:26:06.494402 master-0 kubenswrapper[13205]: I0319 09:26:06.494049 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-66b8ffb895-hfq5p"
Mar 19 09:26:06.498551 master-0 kubenswrapper[13205]: I0319 09:26:06.497743 13205 patch_prober.go:28] interesting pod/downloads-66b8ffb895-hfq5p container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.76:8080/\": dial tcp 10.128.0.76:8080: connect: connection refused" start-of-body=
Mar 19 09:26:06.498551 master-0 kubenswrapper[13205]: I0319 09:26:06.497813 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-hfq5p" podUID="437ab63c-8bc0-4761-81fd-0da0052a9628" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.76:8080/\": dial tcp 10.128.0.76:8080: connect: connection refused"
Mar 19 09:26:06.592291 master-0 kubenswrapper[13205]: I0319 09:26:06.592182 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc049aa3-0344-4031-a38d-05ad0ac0cf4f-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-2k849\" (UID: \"cc049aa3-0344-4031-a38d-05ad0ac0cf4f\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:06.592460 master-0 kubenswrapper[13205]: I0319 09:26:06.592394 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc049aa3-0344-4031-a38d-05ad0ac0cf4f-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-2k849\" (UID: \"cc049aa3-0344-4031-a38d-05ad0ac0cf4f\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:06.593438 master-0 kubenswrapper[13205]: I0319 09:26:06.593391 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc049aa3-0344-4031-a38d-05ad0ac0cf4f-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-2k849\" (UID: \"cc049aa3-0344-4031-a38d-05ad0ac0cf4f\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:06.598773 master-0 kubenswrapper[13205]: I0319 09:26:06.595116 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc049aa3-0344-4031-a38d-05ad0ac0cf4f-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-2k849\" (UID: \"cc049aa3-0344-4031-a38d-05ad0ac0cf4f\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:06.675197 master-0 kubenswrapper[13205]: I0319 09:26:06.675140 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849"
Mar 19 09:26:07.066164 master-0 kubenswrapper[13205]: I0319 09:26:07.066091 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-66b8ffb895-hfq5p" podStartSLOduration=2.945873691 podStartE2EDuration="44.066070563s" podCreationTimestamp="2026-03-19 09:25:23 +0000 UTC" firstStartedPulling="2026-03-19 09:25:24.11111519 +0000 UTC m=+109.443422078" lastFinishedPulling="2026-03-19 09:26:05.231312052 +0000 UTC m=+150.563618950" observedRunningTime="2026-03-19 09:26:06.538733643 +0000 UTC m=+151.871040531" watchObservedRunningTime="2026-03-19 09:26:07.066070563 +0000 UTC m=+152.398377451"
Mar 19 09:26:07.069369 master-0 kubenswrapper[13205]: I0319 09:26:07.067203 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-2k849"]
Mar 19 09:26:07.074327 master-0 kubenswrapper[13205]: W0319 09:26:07.074251 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc049aa3_0344_4031_a38d_05ad0ac0cf4f.slice/crio-0c81a898bfbc5e95d7baf2ae869aa8b31426f2201ec05251a8f385c84a648fea WatchSource:0}: Error finding container 0c81a898bfbc5e95d7baf2ae869aa8b31426f2201ec05251a8f385c84a648fea: Status 404 returned error can't find the container with id 0c81a898bfbc5e95d7baf2ae869aa8b31426f2201ec05251a8f385c84a648fea
Mar 19 09:26:07.505227 master-0 kubenswrapper[13205]: I0319 09:26:07.505048 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849" event={"ID":"cc049aa3-0344-4031-a38d-05ad0ac0cf4f","Type":"ContainerStarted","Data":"0c81a898bfbc5e95d7baf2ae869aa8b31426f2201ec05251a8f385c84a648fea"}
Mar 19 09:26:07.505812 master-0 kubenswrapper[13205]: I0319 09:26:07.505614 13205 patch_prober.go:28] interesting pod/downloads-66b8ffb895-hfq5p container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.76:8080/\": dial tcp 10.128.0.76:8080: connect: connection refused" start-of-body=
Mar 19 09:26:07.505812 master-0 kubenswrapper[13205]: I0319 09:26:07.505691 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-hfq5p" podUID="437ab63c-8bc0-4761-81fd-0da0052a9628" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.76:8080/\": dial tcp 10.128.0.76:8080: connect: connection refused"
Mar 19 09:26:08.510921 master-0 kubenswrapper[13205]: I0319 09:26:08.510862 13205 patch_prober.go:28] interesting pod/downloads-66b8ffb895-hfq5p container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.76:8080/\": dial tcp 10.128.0.76:8080: connect: connection refused" start-of-body=
Mar 19 09:26:08.511560 master-0 kubenswrapper[13205]: I0319 09:26:08.510949 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-hfq5p" podUID="437ab63c-8bc0-4761-81fd-0da0052a9628" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.76:8080/\": dial tcp 10.128.0.76:8080: connect: connection refused"
Mar 19 09:26:12.557674 master-0 kubenswrapper[13205]: I0319 09:26:12.557606 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849" event={"ID":"cc049aa3-0344-4031-a38d-05ad0ac0cf4f","Type":"ContainerStarted","Data":"7e596d53d9a01611486645a5e23ea50cc902d0a002cb83ee2ef7e969d6c5412d"}
Mar 19 09:26:13.699601 master-0 kubenswrapper[13205]: I0319 09:26:13.699510 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-66b8ffb895-hfq5p"
Mar 19 09:26:13.785447 master-0 kubenswrapper[13205]: I0319 09:26:13.784356 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7c6b76c555-2k849" podStartSLOduration=2.689484773 podStartE2EDuration="7.784334834s" podCreationTimestamp="2026-03-19 09:26:06 +0000 UTC" firstStartedPulling="2026-03-19 09:26:07.076479632 +0000 UTC m=+152.408786520" lastFinishedPulling="2026-03-19 09:26:12.171329693 +0000 UTC m=+157.503636581" observedRunningTime="2026-03-19 09:26:13.750747736 +0000 UTC m=+159.083054634" watchObservedRunningTime="2026-03-19 09:26:13.784334834 +0000 UTC m=+159.116641732"
Mar 19 09:26:22.689812 master-0 kubenswrapper[13205]: I0319 09:26:22.689757 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf"
Mar 19 09:26:22.696042 master-0 kubenswrapper[13205]: I0319 09:26:22.696011 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-fd9c84ccc-jlrzf"
Mar 19 09:26:26.026732 master-0 kubenswrapper[13205]: E0319 09:26:26.026684 13205 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file"
Mar 19 09:26:26.028145 master-0 kubenswrapper[13205]: I0319 09:26:26.028116 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:26:26.029132 master-0 kubenswrapper[13205]: I0319 09:26:26.029101 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:26:26.029287 master-0 kubenswrapper[13205]: I0319 09:26:26.029252 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:26:26.030500 master-0 kubenswrapper[13205]: I0319 09:26:26.029428 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630" gracePeriod=15
Mar 19 09:26:26.030500 master-0 kubenswrapper[13205]: I0319 09:26:26.029463 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8" gracePeriod=15
Mar 19 09:26:26.030500 master-0 kubenswrapper[13205]: I0319 09:26:26.029485 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b" gracePeriod=15
Mar 19 09:26:26.030500 master-0 kubenswrapper[13205]: I0319 09:26:26.029582 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0" gracePeriod=15
Mar 19 09:26:26.030500 master-0 kubenswrapper[13205]: I0319 09:26:26.029410 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver" containerID="cri-o://614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5" gracePeriod=15
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.031909 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: E0319 09:26:26.032227 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032241 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: E0319 09:26:26.032254 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032263 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: E0319 09:26:26.032277 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032285 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: E0319 09:26:26.032295 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="setup"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032303 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="setup"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: E0319 09:26:26.032319 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-check-endpoints"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032326 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-check-endpoints"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: E0319 09:26:26.032337 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-syncer"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032345 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-syncer"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032487 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-check-endpoints"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032508 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032536 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032551 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-syncer"
Mar 19 09:26:26.034248 master-0 kubenswrapper[13205]: I0319 09:26:26.032583 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3507630eeeca1ec26dca5ed036e3bb" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 09:26:26.085317 master-0 kubenswrapper[13205]: I0319 09:26:26.085251 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:26:26.085468 master-0 kubenswrapper[13205]: I0319 09:26:26.085327 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:26:26.085468 master-0 kubenswrapper[13205]: I0319 09:26:26.085356 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:26:26.085807 master-0 kubenswrapper[13205]: I0319 09:26:26.085755 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:26:26.086095 master-0 kubenswrapper[13205]: I0319 09:26:26.085974 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:26:26.086161 master-0 kubenswrapper[13205]: I0319 09:26:26.086135 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:26:26.086449 master-0 kubenswrapper[13205]: I0319 09:26:26.086423 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:26:26.086517 master-0 kubenswrapper[13205]: I0319 09:26:26.086462 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:26:26.116384 master-0 kubenswrapper[13205]: E0319 09:26:26.116248 13205 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:26:26.188360 master-0 kubenswrapper[13205]: I0319 09:26:26.188298 13205 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:26.188360 master-0 kubenswrapper[13205]: I0319 09:26:26.188361 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188579 master-0 kubenswrapper[13205]: I0319 09:26:26.188390 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188579 master-0 kubenswrapper[13205]: I0319 09:26:26.188434 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:26.188579 master-0 kubenswrapper[13205]: I0319 09:26:26.188455 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:26.188579 master-0 kubenswrapper[13205]: I0319 09:26:26.188475 
13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188579 master-0 kubenswrapper[13205]: I0319 09:26:26.188506 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188579 master-0 kubenswrapper[13205]: I0319 09:26:26.188541 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188744 master-0 kubenswrapper[13205]: I0319 09:26:26.188623 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188744 master-0 kubenswrapper[13205]: I0319 09:26:26.188669 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 
09:26:26.188744 master-0 kubenswrapper[13205]: I0319 09:26:26.188696 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188744 master-0 kubenswrapper[13205]: I0319 09:26:26.188721 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188857 master-0 kubenswrapper[13205]: I0319 09:26:26.188745 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:26.188857 master-0 kubenswrapper[13205]: I0319 09:26:26.188773 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:26.188857 master-0 kubenswrapper[13205]: I0319 09:26:26.188804 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.188857 master-0 kubenswrapper[13205]: I0319 09:26:26.188832 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.418106 master-0 kubenswrapper[13205]: I0319 09:26:26.417970 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:26.451848 master-0 kubenswrapper[13205]: W0319 09:26:26.451791 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7a82869988463543d3d8dd1f0b5fe3.slice/crio-2712eb527e179590962183e48f5fbe69f7a66e4f06a508c66e6808e61cb41e97 WatchSource:0}: Error finding container 2712eb527e179590962183e48f5fbe69f7a66e4f06a508c66e6808e61cb41e97: Status 404 returned error can't find the container with id 2712eb527e179590962183e48f5fbe69f7a66e4f06a508c66e6808e61cb41e97 Mar 19 09:26:26.455313 master-0 kubenswrapper[13205]: E0319 09:26:26.455182 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e33eab945e096 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:8e7a82869988463543d3d8dd1f0b5fe3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container 
image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:26:26.45428239 +0000 UTC m=+171.786589278,LastTimestamp:2026-03-19 09:26:26.45428239 +0000 UTC m=+171.786589278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:26:26.658003 master-0 kubenswrapper[13205]: I0319 09:26:26.657927 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver-cert-syncer/0.log" Mar 19 09:26:26.658854 master-0 kubenswrapper[13205]: I0319 09:26:26.658817 13205 generic.go:334] "Generic (PLEG): container finished" podID="ac3507630eeeca1ec26dca5ed036e3bb" containerID="a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630" exitCode=0 Mar 19 09:26:26.658854 master-0 kubenswrapper[13205]: I0319 09:26:26.658851 13205 generic.go:334] "Generic (PLEG): container finished" podID="ac3507630eeeca1ec26dca5ed036e3bb" containerID="436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8" exitCode=0 Mar 19 09:26:26.658972 master-0 kubenswrapper[13205]: I0319 09:26:26.658859 13205 generic.go:334] "Generic (PLEG): container finished" podID="ac3507630eeeca1ec26dca5ed036e3bb" containerID="be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b" exitCode=0 Mar 19 09:26:26.658972 master-0 kubenswrapper[13205]: I0319 09:26:26.658868 13205 generic.go:334] "Generic (PLEG): container finished" podID="ac3507630eeeca1ec26dca5ed036e3bb" containerID="f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0" exitCode=2 Mar 19 09:26:26.660471 master-0 kubenswrapper[13205]: I0319 09:26:26.660432 13205 generic.go:334] "Generic (PLEG): container finished" podID="0f12c099-d9a7-48a9-9965-c339c4e32d31" 
containerID="d5263ec6b799cc073e1945d22bbc2ea8e25dd090a5b022d429fcdd2f5e70a626" exitCode=0 Mar 19 09:26:26.660585 master-0 kubenswrapper[13205]: I0319 09:26:26.660502 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"0f12c099-d9a7-48a9-9965-c339c4e32d31","Type":"ContainerDied","Data":"d5263ec6b799cc073e1945d22bbc2ea8e25dd090a5b022d429fcdd2f5e70a626"} Mar 19 09:26:26.662016 master-0 kubenswrapper[13205]: I0319 09:26:26.661949 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"2712eb527e179590962183e48f5fbe69f7a66e4f06a508c66e6808e61cb41e97"} Mar 19 09:26:26.662156 master-0 kubenswrapper[13205]: I0319 09:26:26.662120 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:26.663181 master-0 kubenswrapper[13205]: I0319 09:26:26.663135 13205 status_manager.go:851] "Failed to get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:27.670402 master-0 kubenswrapper[13205]: I0319 09:26:27.670330 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c"} Mar 19 09:26:27.671943 master-0 
kubenswrapper[13205]: I0319 09:26:27.671567 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:27.671943 master-0 kubenswrapper[13205]: E0319 09:26:27.671657 13205 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:27.984540 master-0 kubenswrapper[13205]: I0319 09:26:27.984488 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:26:27.985352 master-0 kubenswrapper[13205]: I0319 09:26:27.985301 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:28.116847 master-0 kubenswrapper[13205]: I0319 09:26:28.116748 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-var-lock\") pod \"0f12c099-d9a7-48a9-9965-c339c4e32d31\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " Mar 19 09:26:28.116847 master-0 kubenswrapper[13205]: I0319 09:26:28.116845 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0f12c099-d9a7-48a9-9965-c339c4e32d31-kube-api-access\") pod \"0f12c099-d9a7-48a9-9965-c339c4e32d31\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " Mar 19 09:26:28.117239 master-0 kubenswrapper[13205]: I0319 09:26:28.116978 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-kubelet-dir\") pod \"0f12c099-d9a7-48a9-9965-c339c4e32d31\" (UID: \"0f12c099-d9a7-48a9-9965-c339c4e32d31\") " Mar 19 09:26:28.117239 master-0 kubenswrapper[13205]: I0319 09:26:28.116968 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-var-lock" (OuterVolumeSpecName: "var-lock") pod "0f12c099-d9a7-48a9-9965-c339c4e32d31" (UID: "0f12c099-d9a7-48a9-9965-c339c4e32d31"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:26:28.117239 master-0 kubenswrapper[13205]: I0319 09:26:28.117183 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0f12c099-d9a7-48a9-9965-c339c4e32d31" (UID: "0f12c099-d9a7-48a9-9965-c339c4e32d31"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:26:28.130020 master-0 kubenswrapper[13205]: I0319 09:26:28.129928 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f12c099-d9a7-48a9-9965-c339c4e32d31-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0f12c099-d9a7-48a9-9965-c339c4e32d31" (UID: "0f12c099-d9a7-48a9-9965-c339c4e32d31"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:26:28.131066 master-0 kubenswrapper[13205]: I0319 09:26:28.131005 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:26:28.131066 master-0 kubenswrapper[13205]: I0319 09:26:28.131061 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f12c099-d9a7-48a9-9965-c339c4e32d31-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:26:28.131319 master-0 kubenswrapper[13205]: I0319 09:26:28.131082 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f12c099-d9a7-48a9-9965-c339c4e32d31-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:26:28.532187 master-0 kubenswrapper[13205]: I0319 09:26:28.532140 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver-cert-syncer/0.log" Mar 19 09:26:28.533271 master-0 kubenswrapper[13205]: I0319 09:26:28.533227 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:28.535013 master-0 kubenswrapper[13205]: I0319 09:26:28.534963 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:28.535754 master-0 kubenswrapper[13205]: I0319 09:26:28.535691 13205 status_manager.go:851] "Failed to get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:28.656176 master-0 kubenswrapper[13205]: I0319 09:26:28.655926 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") pod \"ac3507630eeeca1ec26dca5ed036e3bb\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " Mar 19 09:26:28.656176 master-0 kubenswrapper[13205]: I0319 09:26:28.656033 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") pod \"ac3507630eeeca1ec26dca5ed036e3bb\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " Mar 19 09:26:28.656176 master-0 kubenswrapper[13205]: I0319 09:26:28.656095 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "ac3507630eeeca1ec26dca5ed036e3bb" (UID: "ac3507630eeeca1ec26dca5ed036e3bb"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:26:28.656966 master-0 kubenswrapper[13205]: I0319 09:26:28.656339 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ac3507630eeeca1ec26dca5ed036e3bb" (UID: "ac3507630eeeca1ec26dca5ed036e3bb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:26:28.656966 master-0 kubenswrapper[13205]: I0319 09:26:28.656489 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") pod \"ac3507630eeeca1ec26dca5ed036e3bb\" (UID: \"ac3507630eeeca1ec26dca5ed036e3bb\") " Mar 19 09:26:28.656966 master-0 kubenswrapper[13205]: I0319 09:26:28.656707 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "ac3507630eeeca1ec26dca5ed036e3bb" (UID: "ac3507630eeeca1ec26dca5ed036e3bb"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:26:28.657454 master-0 kubenswrapper[13205]: I0319 09:26:28.657131 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:26:28.657454 master-0 kubenswrapper[13205]: I0319 09:26:28.657153 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:26:28.657454 master-0 kubenswrapper[13205]: I0319 09:26:28.657165 13205 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac3507630eeeca1ec26dca5ed036e3bb-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:26:28.684015 master-0 kubenswrapper[13205]: I0319 09:26:28.683912 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"0f12c099-d9a7-48a9-9965-c339c4e32d31","Type":"ContainerDied","Data":"d4d4e5484af07923bd4b1ed02e7c532a21065e50ad6a24eb1106acca6ea29449"} Mar 19 09:26:28.684015 master-0 kubenswrapper[13205]: I0319 09:26:28.683996 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:26:28.684812 master-0 kubenswrapper[13205]: I0319 09:26:28.684003 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4d4e5484af07923bd4b1ed02e7c532a21065e50ad6a24eb1106acca6ea29449" Mar 19 09:26:28.689019 master-0 kubenswrapper[13205]: I0319 09:26:28.688970 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_ac3507630eeeca1ec26dca5ed036e3bb/kube-apiserver-cert-syncer/0.log" Mar 19 09:26:28.690185 master-0 kubenswrapper[13205]: I0319 09:26:28.690077 13205 generic.go:334] "Generic (PLEG): container finished" podID="ac3507630eeeca1ec26dca5ed036e3bb" containerID="614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5" exitCode=0 Mar 19 09:26:28.690410 master-0 kubenswrapper[13205]: I0319 09:26:28.690202 13205 scope.go:117] "RemoveContainer" containerID="a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630" Mar 19 09:26:28.690941 master-0 kubenswrapper[13205]: I0319 09:26:28.690745 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:28.693694 master-0 kubenswrapper[13205]: E0319 09:26:28.692275 13205 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:26:28.713632 master-0 kubenswrapper[13205]: I0319 09:26:28.713564 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:28.714576 master-0 kubenswrapper[13205]: I0319 09:26:28.714491 13205 status_manager.go:851] "Failed to get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:28.726215 master-0 kubenswrapper[13205]: I0319 09:26:28.726064 13205 scope.go:117] "RemoveContainer" containerID="436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8" Mar 19 09:26:28.728483 master-0 kubenswrapper[13205]: I0319 09:26:28.728440 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:28.728982 master-0 kubenswrapper[13205]: I0319 09:26:28.728946 13205 status_manager.go:851] "Failed to 
get status for pod" podUID="ac3507630eeeca1ec26dca5ed036e3bb" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:28.748624 master-0 kubenswrapper[13205]: I0319 09:26:28.748579 13205 scope.go:117] "RemoveContainer" containerID="be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b" Mar 19 09:26:28.767785 master-0 kubenswrapper[13205]: I0319 09:26:28.767744 13205 scope.go:117] "RemoveContainer" containerID="f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0" Mar 19 09:26:28.782559 master-0 kubenswrapper[13205]: I0319 09:26:28.782449 13205 scope.go:117] "RemoveContainer" containerID="614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5" Mar 19 09:26:28.799097 master-0 kubenswrapper[13205]: I0319 09:26:28.798994 13205 scope.go:117] "RemoveContainer" containerID="fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529" Mar 19 09:26:28.814390 master-0 kubenswrapper[13205]: I0319 09:26:28.814340 13205 scope.go:117] "RemoveContainer" containerID="a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630" Mar 19 09:26:28.814808 master-0 kubenswrapper[13205]: E0319 09:26:28.814765 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630\": container with ID starting with a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630 not found: ID does not exist" containerID="a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630" Mar 19 09:26:28.815078 master-0 kubenswrapper[13205]: I0319 09:26:28.814803 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630"} 
err="failed to get container status \"a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630\": rpc error: code = NotFound desc = could not find container \"a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630\": container with ID starting with a049db050ca6bc5dd5515a0d06b921d1384ccbf52c62dfbd39beb94582593630 not found: ID does not exist" Mar 19 09:26:28.815078 master-0 kubenswrapper[13205]: I0319 09:26:28.814826 13205 scope.go:117] "RemoveContainer" containerID="436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8" Mar 19 09:26:28.815254 master-0 kubenswrapper[13205]: E0319 09:26:28.815138 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8\": container with ID starting with 436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8 not found: ID does not exist" containerID="436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8" Mar 19 09:26:28.815254 master-0 kubenswrapper[13205]: I0319 09:26:28.815178 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8"} err="failed to get container status \"436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8\": rpc error: code = NotFound desc = could not find container \"436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8\": container with ID starting with 436d58e6f0459cedb2b40144fa2e1b34bdded6188e821f819ffb9f703598e7f8 not found: ID does not exist" Mar 19 09:26:28.815254 master-0 kubenswrapper[13205]: I0319 09:26:28.815205 13205 scope.go:117] "RemoveContainer" containerID="be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b" Mar 19 09:26:28.815755 master-0 kubenswrapper[13205]: E0319 09:26:28.815518 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b\": container with ID starting with be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b not found: ID does not exist" containerID="be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b" Mar 19 09:26:28.815755 master-0 kubenswrapper[13205]: I0319 09:26:28.815555 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b"} err="failed to get container status \"be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b\": rpc error: code = NotFound desc = could not find container \"be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b\": container with ID starting with be92726d6f0c98f7c12d2824a7420cf3610a10c520bd61727829d8dfca05705b not found: ID does not exist" Mar 19 09:26:28.815755 master-0 kubenswrapper[13205]: I0319 09:26:28.815569 13205 scope.go:117] "RemoveContainer" containerID="f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0" Mar 19 09:26:28.815977 master-0 kubenswrapper[13205]: E0319 09:26:28.815946 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0\": container with ID starting with f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0 not found: ID does not exist" containerID="f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0" Mar 19 09:26:28.816023 master-0 kubenswrapper[13205]: I0319 09:26:28.815976 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0"} err="failed to get container status \"f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0\": rpc error: code = NotFound desc = could not find 
container \"f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0\": container with ID starting with f4cdd4aa5ee6e9313f94584f1cf8c5b3250ea77616cd4e5e1e587652e3d1cce0 not found: ID does not exist" Mar 19 09:26:28.816023 master-0 kubenswrapper[13205]: I0319 09:26:28.815994 13205 scope.go:117] "RemoveContainer" containerID="614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5" Mar 19 09:26:28.816434 master-0 kubenswrapper[13205]: E0319 09:26:28.816350 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5\": container with ID starting with 614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5 not found: ID does not exist" containerID="614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5" Mar 19 09:26:28.816490 master-0 kubenswrapper[13205]: I0319 09:26:28.816440 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5"} err="failed to get container status \"614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5\": rpc error: code = NotFound desc = could not find container \"614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5\": container with ID starting with 614bc917df6caced8727707d61735f1ca6262f5504d8db0f43fd1ffc7c30fbd5 not found: ID does not exist" Mar 19 09:26:28.816566 master-0 kubenswrapper[13205]: I0319 09:26:28.816495 13205 scope.go:117] "RemoveContainer" containerID="fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529" Mar 19 09:26:28.816999 master-0 kubenswrapper[13205]: E0319 09:26:28.816959 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529\": container with ID starting with 
fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529 not found: ID does not exist" containerID="fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529" Mar 19 09:26:28.816999 master-0 kubenswrapper[13205]: I0319 09:26:28.816987 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529"} err="failed to get container status \"fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529\": rpc error: code = NotFound desc = could not find container \"fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529\": container with ID starting with fe4aa11ba9b87ba831dffe9e66c7f29d228e243f676763cd967178740391f529 not found: ID does not exist" Mar 19 09:26:28.856559 master-0 kubenswrapper[13205]: I0319 09:26:28.856469 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3507630eeeca1ec26dca5ed036e3bb" path="/var/lib/kubelet/pods/ac3507630eeeca1ec26dca5ed036e3bb/volumes" Mar 19 09:26:30.359561 master-0 kubenswrapper[13205]: E0319 09:26:30.359320 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e33eab945e096 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:8e7a82869988463543d3d8dd1f0b5fe3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on 
machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:26:26.45428239 +0000 UTC m=+171.786589278,LastTimestamp:2026-03-19 09:26:26.45428239 +0000 UTC m=+171.786589278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:26:34.314092 master-0 kubenswrapper[13205]: E0319 09:26:34.313991 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:34.315181 master-0 kubenswrapper[13205]: E0319 09:26:34.314836 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:34.315685 master-0 kubenswrapper[13205]: E0319 09:26:34.315630 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:34.316355 master-0 kubenswrapper[13205]: E0319 09:26:34.316287 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:34.317200 master-0 kubenswrapper[13205]: E0319 09:26:34.317114 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:34.317200 master-0 
kubenswrapper[13205]: I0319 09:26:34.317174 13205 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:26:34.317905 master-0 kubenswrapper[13205]: E0319 09:26:34.317837 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 19 09:26:34.519878 master-0 kubenswrapper[13205]: E0319 09:26:34.519765 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 09:26:34.862086 master-0 kubenswrapper[13205]: I0319 09:26:34.861977 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:34.920916 master-0 kubenswrapper[13205]: E0319 09:26:34.920835 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 09:26:35.210234 master-0 kubenswrapper[13205]: E0319 09:26:35.210115 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container 
d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:26:35.723245 master-0 kubenswrapper[13205]: E0319 09:26:35.723167 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 09:26:37.324972 master-0 kubenswrapper[13205]: E0319 09:26:37.324915 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 09:26:38.551111 master-0 kubenswrapper[13205]: E0319 09:26:38.551042 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:26:38Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:26:38Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:26:38Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:26:38Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:38.551759 master-0 kubenswrapper[13205]: E0319 09:26:38.551721 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:38.554773 master-0 kubenswrapper[13205]: E0319 09:26:38.552294 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:38.554773 master-0 kubenswrapper[13205]: E0319 09:26:38.552900 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:38.554773 master-0 kubenswrapper[13205]: E0319 09:26:38.553343 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:38.554773 master-0 kubenswrapper[13205]: E0319 09:26:38.553362 13205 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:26:38.770915 master-0 kubenswrapper[13205]: I0319 09:26:38.770822 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/1.log" Mar 19 09:26:38.772651 master-0 kubenswrapper[13205]: I0319 09:26:38.772612 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/0.log" Mar 19 09:26:38.772774 master-0 kubenswrapper[13205]: I0319 09:26:38.772663 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae" exitCode=1 Mar 19 09:26:38.772774 master-0 kubenswrapper[13205]: I0319 09:26:38.772697 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae"} Mar 19 09:26:38.772774 master-0 kubenswrapper[13205]: I0319 09:26:38.772732 13205 scope.go:117] "RemoveContainer" 
containerID="9889603cf425a1afe622f697ec4d233d82f7e355b75cc078b65e38e02fed7bd5" Mar 19 09:26:38.773676 master-0 kubenswrapper[13205]: I0319 09:26:38.773605 13205 scope.go:117] "RemoveContainer" containerID="96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae" Mar 19 09:26:38.774593 master-0 kubenswrapper[13205]: E0319 09:26:38.773967 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:26:38.776072 master-0 kubenswrapper[13205]: I0319 09:26:38.775989 13205 status_manager.go:851] "Failed to get status for pod" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:38.776807 master-0 kubenswrapper[13205]: I0319 09:26:38.776745 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:39.781555 master-0 kubenswrapper[13205]: I0319 09:26:39.781483 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/1.log" Mar 19 09:26:40.361397 master-0 kubenswrapper[13205]: E0319 09:26:40.361160 
13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e33eab945e096 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:8e7a82869988463543d3d8dd1f0b5fe3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:26:26.45428239 +0000 UTC m=+171.786589278,LastTimestamp:2026-03-19 09:26:26.45428239 +0000 UTC m=+171.786589278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:26:40.527012 master-0 kubenswrapper[13205]: E0319 09:26:40.526916 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 19 09:26:40.849353 master-0 kubenswrapper[13205]: I0319 09:26:40.849275 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:40.851771 master-0 kubenswrapper[13205]: I0319 09:26:40.851665 13205 status_manager.go:851] "Failed to get status for pod" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:40.852957 master-0 kubenswrapper[13205]: I0319 09:26:40.852896 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:40.876550 master-0 kubenswrapper[13205]: I0319 09:26:40.876429 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:40.876550 master-0 kubenswrapper[13205]: I0319 09:26:40.876494 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:40.877972 master-0 kubenswrapper[13205]: E0319 09:26:40.877891 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:40.878751 master-0 kubenswrapper[13205]: I0319 09:26:40.878703 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:40.918256 master-0 kubenswrapper[13205]: W0319 09:26:40.918166 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45ea2ef1cf2bc9d1d994d6538ae0a64.slice/crio-8d5daa22dd60a131f85b0dbb133c9da29987e5057371cc24753d7ef2b7951181 WatchSource:0}: Error finding container 8d5daa22dd60a131f85b0dbb133c9da29987e5057371cc24753d7ef2b7951181: Status 404 returned error can't find the container with id 8d5daa22dd60a131f85b0dbb133c9da29987e5057371cc24753d7ef2b7951181 Mar 19 09:26:41.808308 master-0 kubenswrapper[13205]: I0319 09:26:41.808187 13205 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9" exitCode=0 Mar 19 09:26:41.808308 master-0 kubenswrapper[13205]: I0319 09:26:41.808255 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9"} Mar 19 09:26:41.808308 master-0 kubenswrapper[13205]: I0319 09:26:41.808295 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"8d5daa22dd60a131f85b0dbb133c9da29987e5057371cc24753d7ef2b7951181"} Mar 19 09:26:41.809103 master-0 kubenswrapper[13205]: I0319 09:26:41.809040 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:41.809103 master-0 kubenswrapper[13205]: I0319 09:26:41.809082 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:41.810244 master-0 kubenswrapper[13205]: E0319 09:26:41.810165 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:41.810244 master-0 kubenswrapper[13205]: I0319 09:26:41.810209 13205 status_manager.go:851] "Failed to get status for pod" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:41.811846 master-0 kubenswrapper[13205]: I0319 09:26:41.811279 13205 status_manager.go:851] "Failed to get status for pod" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:26:42.817468 master-0 kubenswrapper[13205]: I0319 09:26:42.817412 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d"} Mar 19 09:26:42.817468 master-0 kubenswrapper[13205]: I0319 09:26:42.817472 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52"} Mar 19 09:26:43.824936 master-0 
kubenswrapper[13205]: I0319 09:26:43.824867 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3"} Mar 19 09:26:43.824936 master-0 kubenswrapper[13205]: I0319 09:26:43.824910 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894"} Mar 19 09:26:43.824936 master-0 kubenswrapper[13205]: I0319 09:26:43.824919 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a"} Mar 19 09:26:43.825717 master-0 kubenswrapper[13205]: I0319 09:26:43.825011 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:43.825717 master-0 kubenswrapper[13205]: I0319 09:26:43.825121 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:43.825717 master-0 kubenswrapper[13205]: I0319 09:26:43.825138 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:44.012475 master-0 kubenswrapper[13205]: I0319 09:26:44.012393 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:26:44.012985 master-0 kubenswrapper[13205]: I0319 09:26:44.012951 13205 scope.go:117] "RemoveContainer" 
containerID="96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae" Mar 19 09:26:44.013212 master-0 kubenswrapper[13205]: E0319 09:26:44.013185 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:26:44.840474 master-0 kubenswrapper[13205]: I0319 09:26:44.840438 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:26:44.841547 master-0 kubenswrapper[13205]: I0319 09:26:44.841519 13205 scope.go:117] "RemoveContainer" containerID="96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae" Mar 19 09:26:44.841864 master-0 kubenswrapper[13205]: E0319 09:26:44.841846 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:26:45.879929 master-0 kubenswrapper[13205]: I0319 09:26:45.879780 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:45.879929 master-0 kubenswrapper[13205]: I0319 09:26:45.879856 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:45.888590 master-0 kubenswrapper[13205]: I0319 09:26:45.888507 
13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:47.631115 master-0 kubenswrapper[13205]: I0319 09:26:47.631066 13205 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:26:47.631666 master-0 kubenswrapper[13205]: I0319 09:26:47.631596 13205 scope.go:117] "RemoveContainer" containerID="96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae" Mar 19 09:26:47.631983 master-0 kubenswrapper[13205]: E0319 09:26:47.631945 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:26:49.615362 master-0 kubenswrapper[13205]: I0319 09:26:49.615283 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:49.763974 master-0 kubenswrapper[13205]: I0319 09:26:49.763897 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" podUID="13e99065-2a42-467c-8585-97de1dffebb8" Mar 19 09:26:50.001681 master-0 kubenswrapper[13205]: I0319 09:26:50.001474 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:50.001681 master-0 kubenswrapper[13205]: I0319 09:26:50.001527 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:50.005861 master-0 kubenswrapper[13205]: I0319 09:26:50.005803 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" podUID="13e99065-2a42-467c-8585-97de1dffebb8" Mar 19 09:26:50.007425 master-0 kubenswrapper[13205]: I0319 09:26:50.007383 13205 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52" Mar 19 09:26:50.007660 master-0 kubenswrapper[13205]: I0319 09:26:50.007630 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:26:51.009333 master-0 kubenswrapper[13205]: I0319 09:26:51.009255 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:51.009333 master-0 kubenswrapper[13205]: I0319 09:26:51.009299 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="30ca6111-2c23-4604-b205-53ab031af043" Mar 19 09:26:51.011975 master-0 kubenswrapper[13205]: I0319 09:26:51.011911 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" podUID="13e99065-2a42-467c-8585-97de1dffebb8" Mar 19 09:26:58.176558 master-0 kubenswrapper[13205]: I0319 09:26:58.173166 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:26:58.831776 master-0 kubenswrapper[13205]: I0319 09:26:58.831701 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 09:26:58.849979 master-0 kubenswrapper[13205]: I0319 09:26:58.849921 13205 scope.go:117] "RemoveContainer" containerID="96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae" Mar 19 09:26:59.046369 master-0 kubenswrapper[13205]: I0319 09:26:59.046333 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-pvd7b" Mar 19 09:26:59.230946 master-0 kubenswrapper[13205]: I0319 09:26:59.230872 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 09:26:59.428010 master-0 kubenswrapper[13205]: I0319 09:26:59.427877 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 09:26:59.512400 master-0 kubenswrapper[13205]: I0319 09:26:59.512297 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:26:59.701240 master-0 kubenswrapper[13205]: I0319 09:26:59.701118 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:26:59.725909 master-0 kubenswrapper[13205]: I0319 09:26:59.725869 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 09:27:00.069278 master-0 kubenswrapper[13205]: I0319 09:27:00.069242 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/1.log" Mar 19 09:27:00.070711 master-0 kubenswrapper[13205]: I0319 09:27:00.070654 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f"} Mar 19 09:27:00.252567 master-0 kubenswrapper[13205]: I0319 09:27:00.252492 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:27:00.449961 master-0 kubenswrapper[13205]: I0319 09:27:00.449609 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-9tw96" Mar 19 09:27:00.579947 master-0 kubenswrapper[13205]: I0319 09:27:00.579890 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 19 09:27:00.723586 master-0 kubenswrapper[13205]: I0319 09:27:00.723446 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:27:00.752484 master-0 kubenswrapper[13205]: I0319 09:27:00.752434 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 09:27:00.975777 master-0 kubenswrapper[13205]: I0319 09:27:00.975645 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 09:27:00.980300 master-0 kubenswrapper[13205]: I0319 09:27:00.980252 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 19 09:27:01.076451 master-0 kubenswrapper[13205]: I0319 09:27:01.076405 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 09:27:01.167844 master-0 kubenswrapper[13205]: I0319 09:27:01.167777 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 09:27:01.212988 master-0 kubenswrapper[13205]: I0319 09:27:01.212922 13205 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:27:01.272907 master-0 kubenswrapper[13205]: I0319 09:27:01.272856 13205 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:27:01.541219 master-0 kubenswrapper[13205]: I0319 09:27:01.541036 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 09:27:01.603651 master-0 kubenswrapper[13205]: I0319 09:27:01.603574 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-p5dd4" Mar 19 09:27:01.604922 master-0 kubenswrapper[13205]: I0319 09:27:01.604887 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 09:27:01.634552 master-0 kubenswrapper[13205]: I0319 09:27:01.634445 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:27:01.915883 master-0 kubenswrapper[13205]: I0319 09:27:01.915710 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-2zxtq" Mar 19 09:27:02.240474 master-0 kubenswrapper[13205]: I0319 09:27:02.240268 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:27:02.242760 master-0 kubenswrapper[13205]: I0319 09:27:02.242713 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-sjz5s" Mar 19 09:27:02.293066 master-0 kubenswrapper[13205]: I0319 09:27:02.292342 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:27:02.307052 master-0 kubenswrapper[13205]: I0319 09:27:02.306975 13205 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 09:27:02.328031 master-0 kubenswrapper[13205]: I0319 09:27:02.327962 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 09:27:02.489359 master-0 kubenswrapper[13205]: I0319 09:27:02.489307 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 09:27:02.564832 master-0 kubenswrapper[13205]: I0319 09:27:02.564748 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:27:02.707776 master-0 kubenswrapper[13205]: I0319 09:27:02.706984 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gfzzh" Mar 19 09:27:02.740150 master-0 kubenswrapper[13205]: I0319 09:27:02.740113 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 19 09:27:02.756477 master-0 kubenswrapper[13205]: I0319 09:27:02.756436 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 09:27:02.813653 master-0 kubenswrapper[13205]: I0319 09:27:02.813590 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 09:27:02.857362 master-0 kubenswrapper[13205]: I0319 09:27:02.857229 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:27:02.898258 master-0 kubenswrapper[13205]: I0319 09:27:02.898196 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 09:27:02.962174 master-0 kubenswrapper[13205]: 
I0319 09:27:02.962117 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:27:03.055507 master-0 kubenswrapper[13205]: I0319 09:27:03.055446 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 09:27:03.090419 master-0 kubenswrapper[13205]: I0319 09:27:03.090354 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:27:03.119978 master-0 kubenswrapper[13205]: I0319 09:27:03.119832 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-lsnll" Mar 19 09:27:03.119978 master-0 kubenswrapper[13205]: I0319 09:27:03.119945 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 19 09:27:03.128565 master-0 kubenswrapper[13205]: I0319 09:27:03.128482 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xh8t6" Mar 19 09:27:03.153768 master-0 kubenswrapper[13205]: I0319 09:27:03.153708 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 09:27:03.226998 master-0 kubenswrapper[13205]: I0319 09:27:03.226953 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 09:27:03.280513 master-0 kubenswrapper[13205]: I0319 09:27:03.280458 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 09:27:03.283214 master-0 kubenswrapper[13205]: I0319 09:27:03.283162 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 09:27:03.305237 master-0 
kubenswrapper[13205]: I0319 09:27:03.305180 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 09:27:03.371166 master-0 kubenswrapper[13205]: I0319 09:27:03.371037 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 09:27:03.399784 master-0 kubenswrapper[13205]: I0319 09:27:03.399729 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-224sj" Mar 19 09:27:03.415959 master-0 kubenswrapper[13205]: I0319 09:27:03.415915 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:27:03.621349 master-0 kubenswrapper[13205]: I0319 09:27:03.621147 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 09:27:03.628268 master-0 kubenswrapper[13205]: I0319 09:27:03.628184 13205 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 09:27:03.629314 master-0 kubenswrapper[13205]: I0319 09:27:03.629280 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-mvv8v" Mar 19 09:27:03.635121 master-0 kubenswrapper[13205]: I0319 09:27:03.634989 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:27:03.635121 master-0 kubenswrapper[13205]: I0319 09:27:03.635068 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:27:03.640181 master-0 kubenswrapper[13205]: I0319 09:27:03.640145 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:27:03.678646 master-0 
kubenswrapper[13205]: I0319 09:27:03.678594 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:27:03.704137 master-0 kubenswrapper[13205]: I0319 09:27:03.704083 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:27:03.765927 master-0 kubenswrapper[13205]: I0319 09:27:03.765861 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 09:27:03.809922 master-0 kubenswrapper[13205]: I0319 09:27:03.809841 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 09:27:03.813331 master-0 kubenswrapper[13205]: I0319 09:27:03.813176 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:27:03.819062 master-0 kubenswrapper[13205]: I0319 09:27:03.818984 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 09:27:03.847569 master-0 kubenswrapper[13205]: I0319 09:27:03.847465 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 09:27:03.858013 master-0 kubenswrapper[13205]: I0319 09:27:03.857965 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:27:03.895842 master-0 kubenswrapper[13205]: I0319 09:27:03.895693 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-45zpl" Mar 19 09:27:04.001401 master-0 kubenswrapper[13205]: I0319 09:27:04.001289 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 09:27:04.012634 master-0 kubenswrapper[13205]: I0319 09:27:04.012580 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:27:04.013111 master-0 kubenswrapper[13205]: I0319 09:27:04.013025 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 19 09:27:04.013189 master-0 kubenswrapper[13205]: I0319 09:27:04.013138 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 19 09:27:04.017984 master-0 kubenswrapper[13205]: I0319 09:27:04.017953 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:27:04.150831 master-0 kubenswrapper[13205]: I0319 09:27:04.149624 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 09:27:04.157033 master-0 kubenswrapper[13205]: I0319 09:27:04.156986 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:27:04.171868 master-0 kubenswrapper[13205]: I0319 09:27:04.171817 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 09:27:04.199510 master-0 kubenswrapper[13205]: I0319 09:27:04.199473 13205 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:27:04.225741 master-0 kubenswrapper[13205]: I0319 09:27:04.225673 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:27:04.233870 master-0 kubenswrapper[13205]: I0319 09:27:04.233826 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:27:04.235263 master-0 kubenswrapper[13205]: I0319 09:27:04.235237 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 09:27:04.303623 master-0 kubenswrapper[13205]: I0319 09:27:04.303558 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:27:04.331521 master-0 kubenswrapper[13205]: I0319 09:27:04.331462 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 09:27:04.352054 master-0 kubenswrapper[13205]: I0319 09:27:04.352015 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 09:27:04.365380 master-0 kubenswrapper[13205]: I0319 09:27:04.365333 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-s8fs4" Mar 19 09:27:04.387268 master-0 kubenswrapper[13205]: I0319 09:27:04.387239 13205 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:27:04.407693 master-0 kubenswrapper[13205]: I0319 09:27:04.407554 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:27:04.411964 master-0 
kubenswrapper[13205]: I0319 09:27:04.411922 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:27:04.477506 master-0 kubenswrapper[13205]: I0319 09:27:04.477432 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-czcgc" Mar 19 09:27:04.496811 master-0 kubenswrapper[13205]: I0319 09:27:04.496748 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 09:27:04.526237 master-0 kubenswrapper[13205]: I0319 09:27:04.526135 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 09:27:04.557099 master-0 kubenswrapper[13205]: I0319 09:27:04.556933 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:27:04.581123 master-0 kubenswrapper[13205]: I0319 09:27:04.581058 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:27:04.675219 master-0 kubenswrapper[13205]: I0319 09:27:04.675086 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 09:27:04.689018 master-0 kubenswrapper[13205]: I0319 09:27:04.688978 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:27:04.778669 master-0 kubenswrapper[13205]: I0319 09:27:04.778632 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:27:04.797161 master-0 kubenswrapper[13205]: I0319 09:27:04.797106 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 
09:27:04.835943 master-0 kubenswrapper[13205]: I0319 09:27:04.835900 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-4wm5n" Mar 19 09:27:04.840696 master-0 kubenswrapper[13205]: I0319 09:27:04.840672 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:27:04.847445 master-0 kubenswrapper[13205]: I0319 09:27:04.847394 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 09:27:04.849588 master-0 kubenswrapper[13205]: I0319 09:27:04.849558 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 09:27:04.851104 master-0 kubenswrapper[13205]: I0319 09:27:04.851076 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:27:04.857832 master-0 kubenswrapper[13205]: I0319 09:27:04.857800 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 19 09:27:04.870758 master-0 kubenswrapper[13205]: I0319 09:27:04.870579 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 09:27:05.027102 master-0 kubenswrapper[13205]: I0319 09:27:05.027028 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:27:05.060224 master-0 kubenswrapper[13205]: I0319 09:27:05.060174 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 09:27:05.134957 master-0 kubenswrapper[13205]: I0319 09:27:05.134891 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:27:05.141288 master-0 kubenswrapper[13205]: I0319 09:27:05.141239 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 09:27:05.171191 master-0 kubenswrapper[13205]: I0319 09:27:05.171156 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:27:05.237567 master-0 kubenswrapper[13205]: I0319 09:27:05.237465 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:27:05.294815 master-0 kubenswrapper[13205]: I0319 09:27:05.294693 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:27:05.332782 master-0 kubenswrapper[13205]: I0319 09:27:05.323677 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 09:27:05.332782 master-0 kubenswrapper[13205]: I0319 09:27:05.325014 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 09:27:05.408952 master-0 kubenswrapper[13205]: I0319 09:27:05.408886 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 09:27:05.441331 master-0 kubenswrapper[13205]: I0319 09:27:05.441273 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-bxhvs" Mar 19 09:27:05.474300 master-0 kubenswrapper[13205]: I0319 09:27:05.474231 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:27:05.500443 master-0 kubenswrapper[13205]: I0319 09:27:05.500375 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:27:05.643778 master-0 kubenswrapper[13205]: I0319 09:27:05.642905 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:27:05.664638 master-0 kubenswrapper[13205]: I0319 09:27:05.664587 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 09:27:05.686663 master-0 kubenswrapper[13205]: I0319 09:27:05.686617 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:27:05.692304 master-0 kubenswrapper[13205]: I0319 09:27:05.692227 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:27:05.736859 master-0 kubenswrapper[13205]: I0319 09:27:05.736813 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:27:05.742318 master-0 kubenswrapper[13205]: I0319 09:27:05.742294 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 19 09:27:05.757024 master-0 kubenswrapper[13205]: I0319 09:27:05.756970 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:27:05.785788 master-0 kubenswrapper[13205]: I0319 09:27:05.785745 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 09:27:05.885727 master-0 kubenswrapper[13205]: I0319 09:27:05.885660 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 09:27:06.078771 master-0 kubenswrapper[13205]: I0319 09:27:06.078708 13205 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:27:06.110082 master-0 kubenswrapper[13205]: I0319 09:27:06.109978 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 09:27:06.124375 master-0 kubenswrapper[13205]: I0319 09:27:06.124290 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 09:27:06.171960 master-0 kubenswrapper[13205]: I0319 09:27:06.171878 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:27:06.257458 master-0 kubenswrapper[13205]: I0319 09:27:06.257381 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 19 09:27:06.266376 master-0 kubenswrapper[13205]: I0319 09:27:06.266329 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:27:06.273852 master-0 kubenswrapper[13205]: I0319 09:27:06.273803 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 09:27:06.283413 master-0 kubenswrapper[13205]: I0319 09:27:06.283321 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:27:06.318927 master-0 kubenswrapper[13205]: I0319 09:27:06.318871 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:27:06.330128 master-0 kubenswrapper[13205]: I0319 09:27:06.330032 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 09:27:06.362195 master-0 kubenswrapper[13205]: I0319 09:27:06.362133 13205 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 09:27:06.376104 master-0 kubenswrapper[13205]: I0319 09:27:06.376054 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 19 09:27:06.453269 master-0 kubenswrapper[13205]: I0319 09:27:06.453219 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 09:27:06.454047 master-0 kubenswrapper[13205]: I0319 09:27:06.453983 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 09:27:06.480326 master-0 kubenswrapper[13205]: I0319 09:27:06.480252 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=17.480224649 podStartE2EDuration="17.480224649s" podCreationTimestamp="2026-03-19 09:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:06.471159734 +0000 UTC m=+211.803466662" watchObservedRunningTime="2026-03-19 09:27:06.480224649 +0000 UTC m=+211.812531537" Mar 19 09:27:06.519681 master-0 kubenswrapper[13205]: I0319 09:27:06.519636 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:27:06.603299 master-0 kubenswrapper[13205]: I0319 09:27:06.603164 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:27:06.627425 master-0 kubenswrapper[13205]: I0319 09:27:06.627366 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:27:06.691339 master-0 
kubenswrapper[13205]: I0319 09:27:06.691280 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:27:06.711652 master-0 kubenswrapper[13205]: I0319 09:27:06.711579 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-khk7h"
Mar 19 09:27:06.727829 master-0 kubenswrapper[13205]: I0319 09:27:06.727780 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-zbzf5"
Mar 19 09:27:06.809195 master-0 kubenswrapper[13205]: I0319 09:27:06.809124 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-xnqzt"
Mar 19 09:27:06.941964 master-0 kubenswrapper[13205]: I0319 09:27:06.941811 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 09:27:07.021473 master-0 kubenswrapper[13205]: I0319 09:27:07.021420 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 09:27:07.023002 master-0 kubenswrapper[13205]: I0319 09:27:07.022962 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 19 09:27:07.031649 master-0 kubenswrapper[13205]: I0319 09:27:07.031590 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2svn2"
Mar 19 09:27:07.055294 master-0 kubenswrapper[13205]: I0319 09:27:07.055195 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-99v25"
Mar 19 09:27:07.058614 master-0 kubenswrapper[13205]: I0319 09:27:07.058496 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 09:27:07.059105 master-0 kubenswrapper[13205]: I0319 09:27:07.059071 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 09:27:07.119654 master-0 kubenswrapper[13205]: I0319 09:27:07.119612 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 19 09:27:07.148111 master-0 kubenswrapper[13205]: I0319 09:27:07.148061 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 09:27:07.156019 master-0 kubenswrapper[13205]: I0319 09:27:07.155961 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 09:27:07.180671 master-0 kubenswrapper[13205]: I0319 09:27:07.180432 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tv2z8"
Mar 19 09:27:07.234025 master-0 kubenswrapper[13205]: I0319 09:27:07.232497 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 09:27:07.283237 master-0 kubenswrapper[13205]: I0319 09:27:07.283161 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 09:27:07.297680 master-0 kubenswrapper[13205]: I0319 09:27:07.297642 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 09:27:07.348422 master-0 kubenswrapper[13205]: I0319 09:27:07.348363 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:27:07.362498 master-0 kubenswrapper[13205]: I0319 09:27:07.362446 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 09:27:07.374968 master-0 kubenswrapper[13205]: I0319 09:27:07.374906 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 09:27:07.375200 master-0 kubenswrapper[13205]: I0319 09:27:07.374988 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 09:27:07.375200 master-0 kubenswrapper[13205]: I0319 09:27:07.375002 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 09:27:07.386408 master-0 kubenswrapper[13205]: I0319 09:27:07.386368 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 09:27:07.395042 master-0 kubenswrapper[13205]: I0319 09:27:07.395021 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 09:27:07.453171 master-0 kubenswrapper[13205]: I0319 09:27:07.453106 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-b9dtc"
Mar 19 09:27:07.473794 master-0 kubenswrapper[13205]: I0319 09:27:07.473696 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:27:07.477677 master-0 kubenswrapper[13205]: I0319 09:27:07.477627 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 09:27:07.492084 master-0 kubenswrapper[13205]: I0319 09:27:07.491960 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:27:07.587543 master-0 kubenswrapper[13205]: I0319 09:27:07.587463 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-jvtc2"
Mar 19 09:27:07.675452 master-0 kubenswrapper[13205]: I0319 09:27:07.669369 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 19 09:27:07.690691 master-0 kubenswrapper[13205]: I0319 09:27:07.690638 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 09:27:07.783491 master-0 kubenswrapper[13205]: I0319 09:27:07.783444 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-252nv"
Mar 19 09:27:07.829493 master-0 kubenswrapper[13205]: I0319 09:27:07.829435 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 09:27:07.875071 master-0 kubenswrapper[13205]: I0319 09:27:07.875030 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 09:27:07.996941 master-0 kubenswrapper[13205]: I0319 09:27:07.996906 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 09:27:08.039650 master-0 kubenswrapper[13205]: I0319 09:27:08.039566 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 09:27:08.120773 master-0 kubenswrapper[13205]: I0319 09:27:08.120717 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 09:27:08.131438 master-0 kubenswrapper[13205]: I0319 09:27:08.131396 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 09:27:08.217412 master-0 kubenswrapper[13205]: I0319 09:27:08.217328 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 09:27:08.224651 master-0 kubenswrapper[13205]: I0319 09:27:08.224593 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 09:27:08.228213 master-0 kubenswrapper[13205]: I0319 09:27:08.228151 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-p8jxl"
Mar 19 09:27:08.228377 master-0 kubenswrapper[13205]: I0319 09:27:08.228259 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:27:08.241425 master-0 kubenswrapper[13205]: I0319 09:27:08.241345 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 09:27:08.276905 master-0 kubenswrapper[13205]: I0319 09:27:08.276823 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 09:27:08.386876 master-0 kubenswrapper[13205]: I0319 09:27:08.386759 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 09:27:08.389081 master-0 kubenswrapper[13205]: I0319 09:27:08.389037 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:27:08.402647 master-0 kubenswrapper[13205]: I0319 09:27:08.402596 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:27:08.494655 master-0 kubenswrapper[13205]: I0319 09:27:08.494603 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 09:27:08.510381 master-0 kubenswrapper[13205]: I0319 09:27:08.510323 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 09:27:08.518733 master-0 kubenswrapper[13205]: I0319 09:27:08.518688 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 09:27:08.546687 master-0 kubenswrapper[13205]: I0319 09:27:08.546655 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 09:27:08.551973 master-0 kubenswrapper[13205]: I0319 09:27:08.551953 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 09:27:08.557754 master-0 kubenswrapper[13205]: I0319 09:27:08.557656 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 09:27:08.614799 master-0 kubenswrapper[13205]: I0319 09:27:08.614746 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:27:08.761411 master-0 kubenswrapper[13205]: I0319 09:27:08.761256 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6j7vofh1gbciq"
Mar 19 09:27:08.778980 master-0 kubenswrapper[13205]: I0319 09:27:08.778347 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:27:08.953976 master-0 kubenswrapper[13205]: I0319 09:27:08.953838 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 09:27:08.972808 master-0 kubenswrapper[13205]: I0319 09:27:08.972740 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:27:09.008382 master-0 kubenswrapper[13205]: I0319 09:27:09.008316 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:27:09.175460 master-0 kubenswrapper[13205]: I0319 09:27:09.175420 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 09:27:09.241147 master-0 kubenswrapper[13205]: I0319 09:27:09.241082 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 09:27:09.258553 master-0 kubenswrapper[13205]: I0319 09:27:09.258467 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 09:27:09.258861 master-0 kubenswrapper[13205]: I0319 09:27:09.258815 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 09:27:09.279853 master-0 kubenswrapper[13205]: I0319 09:27:09.279819 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 19 09:27:09.310972 master-0 kubenswrapper[13205]: I0319 09:27:09.310910 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:27:09.342675 master-0 kubenswrapper[13205]: I0319 09:27:09.342612 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:27:09.385731 master-0 kubenswrapper[13205]: I0319 09:27:09.385693 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:27:09.528681 master-0 kubenswrapper[13205]: I0319 09:27:09.528611 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:27:09.529401 master-0 kubenswrapper[13205]: I0319 09:27:09.528861 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 09:27:09.551466 master-0 kubenswrapper[13205]: I0319 09:27:09.551431 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-avpd2mlhiq4t"
Mar 19 09:27:09.588444 master-0 kubenswrapper[13205]: I0319 09:27:09.588376 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 09:27:09.713187 master-0 kubenswrapper[13205]: I0319 09:27:09.713130 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 09:27:09.720907 master-0 kubenswrapper[13205]: I0319 09:27:09.720829 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 09:27:09.731673 master-0 kubenswrapper[13205]: I0319 09:27:09.731548 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-fmrxp"
Mar 19 09:27:09.746503 master-0 kubenswrapper[13205]: I0319 09:27:09.746450 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 09:27:09.773225 master-0 kubenswrapper[13205]: I0319 09:27:09.773166 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 09:27:09.787425 master-0 kubenswrapper[13205]: I0319 09:27:09.787335 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 09:27:09.795311 master-0 kubenswrapper[13205]: I0319 09:27:09.795281 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-r2bsk"
Mar 19 09:27:09.875891 master-0 kubenswrapper[13205]: I0319 09:27:09.875859 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 09:27:09.916036 master-0 kubenswrapper[13205]: I0319 09:27:09.915987 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 09:27:10.003643 master-0 kubenswrapper[13205]: I0319 09:27:10.001553 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 09:27:10.007303 master-0 kubenswrapper[13205]: I0319 09:27:10.007256 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 09:27:10.029425 master-0 kubenswrapper[13205]: I0319 09:27:10.029068 13205 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 09:27:10.118399 master-0 kubenswrapper[13205]: I0319 09:27:10.118277 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 09:27:10.207224 master-0 kubenswrapper[13205]: I0319 09:27:10.207154 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-d89qv"
Mar 19 09:27:10.263968 master-0 kubenswrapper[13205]: I0319 09:27:10.263899 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 09:27:10.348428 master-0 kubenswrapper[13205]: I0319 09:27:10.348376 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:27:10.370840 master-0 kubenswrapper[13205]: I0319 09:27:10.370708 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 09:27:10.438806 master-0 kubenswrapper[13205]: I0319 09:27:10.438751 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 09:27:10.564050 master-0 kubenswrapper[13205]: I0319 09:27:10.563909 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:27:10.660713 master-0 kubenswrapper[13205]: I0319 09:27:10.660502 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-fdmh2"
Mar 19 09:27:10.779380 master-0 kubenswrapper[13205]: I0319 09:27:10.779318 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 09:27:10.780589 master-0 kubenswrapper[13205]: I0319 09:27:10.780563 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 09:27:10.812081 master-0 kubenswrapper[13205]: I0319 09:27:10.812013 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 09:27:10.926879 master-0 kubenswrapper[13205]: I0319 09:27:10.924964 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 19 09:27:10.953600 master-0 kubenswrapper[13205]: I0319 09:27:10.953322 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 09:27:10.956512 master-0 kubenswrapper[13205]: I0319 09:27:10.956453 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 09:27:11.038473 master-0 kubenswrapper[13205]: I0319 09:27:11.038342 13205 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 09:27:11.164936 master-0 kubenswrapper[13205]: I0319 09:27:11.164857 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 09:27:11.179462 master-0 kubenswrapper[13205]: I0319 09:27:11.179309 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 19 09:27:11.198681 master-0 kubenswrapper[13205]: I0319 09:27:11.198629 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 09:27:11.248665 master-0 kubenswrapper[13205]: I0319 09:27:11.248589 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 09:27:11.267347 master-0 kubenswrapper[13205]: I0319 09:27:11.267290 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 09:27:11.283648 master-0 kubenswrapper[13205]: I0319 09:27:11.283515 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 19 09:27:11.349993 master-0 kubenswrapper[13205]: I0319 09:27:11.349905 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-lpljf"
Mar 19 09:27:11.402241 master-0 kubenswrapper[13205]: I0319 09:27:11.402179 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 09:27:11.438501 master-0 kubenswrapper[13205]: I0319 09:27:11.438324 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 09:27:11.463990 master-0 kubenswrapper[13205]: I0319 09:27:11.463945 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:27:11.535959 master-0 kubenswrapper[13205]: I0319 09:27:11.535849 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 09:27:11.650916 master-0 kubenswrapper[13205]: I0319 09:27:11.650856 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 09:27:11.803314 master-0 kubenswrapper[13205]: I0319 09:27:11.803250 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 09:27:11.932611 master-0 kubenswrapper[13205]: I0319 09:27:11.932560 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 09:27:11.999292 master-0 kubenswrapper[13205]: I0319 09:27:11.999191 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:27:12.017885 master-0 kubenswrapper[13205]: I0319 09:27:12.017829 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 09:27:12.061387 master-0 kubenswrapper[13205]: I0319 09:27:12.061246 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 09:27:12.133202 master-0 kubenswrapper[13205]: I0319 09:27:12.133132 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qttf4"
Mar 19 09:27:12.175052 master-0 kubenswrapper[13205]: I0319 09:27:12.174987 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 09:27:12.273630 master-0 kubenswrapper[13205]: I0319 09:27:12.273593 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:27:12.273859 master-0 kubenswrapper[13205]: I0319 09:27:12.273798 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" containerID="cri-o://21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c" gracePeriod=5
Mar 19 09:27:12.318119 master-0 kubenswrapper[13205]: I0319 09:27:12.318027 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 09:27:12.318674 master-0 kubenswrapper[13205]: I0319 09:27:12.318640 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-s9ktx"
Mar 19 09:27:12.377777 master-0 kubenswrapper[13205]: I0319 09:27:12.377733 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 09:27:12.382123 master-0 kubenswrapper[13205]: I0319 09:27:12.382074 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 09:27:12.383152 master-0 kubenswrapper[13205]: I0319 09:27:12.383131 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 09:27:12.437335 master-0 kubenswrapper[13205]: I0319 09:27:12.437284 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 09:27:12.464464 master-0 kubenswrapper[13205]: I0319 09:27:12.464414 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 09:27:12.546461 master-0 kubenswrapper[13205]: I0319 09:27:12.546397 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 19 09:27:12.599958 master-0 kubenswrapper[13205]: I0319 09:27:12.599828 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 09:27:12.699229 master-0 kubenswrapper[13205]: I0319 09:27:12.699164 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 09:27:12.744028 master-0 kubenswrapper[13205]: I0319 09:27:12.743963 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 09:27:12.860255 master-0 kubenswrapper[13205]: I0319 09:27:12.860135 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 09:27:12.916571 master-0 kubenswrapper[13205]: I0319 09:27:12.916491 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-wfcn9"
Mar 19 09:27:13.058552 master-0 kubenswrapper[13205]: I0319 09:27:13.058491 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 09:27:13.225374 master-0 kubenswrapper[13205]: I0319 09:27:13.225266 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 09:27:13.286794 master-0 kubenswrapper[13205]: I0319 09:27:13.286742 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 09:27:13.354707 master-0 kubenswrapper[13205]: I0319 09:27:13.354656 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-qn2w4"
Mar 19 09:27:13.398510 master-0 kubenswrapper[13205]: I0319 09:27:13.398455 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-rcdtx"
Mar 19 09:27:13.413948 master-0 kubenswrapper[13205]: I0319 09:27:13.413909 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 09:27:13.544027 master-0 kubenswrapper[13205]: I0319 09:27:13.543947 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2rqbc"
Mar 19 09:27:13.578667 master-0 kubenswrapper[13205]: I0319 09:27:13.578602 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 09:27:13.620126 master-0 kubenswrapper[13205]: I0319 09:27:13.620081 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-7dv6h"
Mar 19 09:27:13.786992 master-0 kubenswrapper[13205]: I0319 09:27:13.786940 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 09:27:13.924444 master-0 kubenswrapper[13205]: I0319 09:27:13.924292 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 09:27:13.985475 master-0 kubenswrapper[13205]: I0319 09:27:13.985377 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-92z97"
Mar 19 09:27:14.013092 master-0 kubenswrapper[13205]: I0319 09:27:14.013025 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 09:27:14.013341 master-0 kubenswrapper[13205]: I0319 09:27:14.013105 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 09:27:14.042304 master-0 kubenswrapper[13205]: I0319 09:27:14.042220 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 09:27:14.100714 master-0 kubenswrapper[13205]: I0319 09:27:14.100633 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 09:27:14.111627 master-0 kubenswrapper[13205]: I0319 09:27:14.107211 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 19 09:27:14.172250 master-0 kubenswrapper[13205]: I0319 09:27:14.172185 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 09:27:14.288627 master-0 kubenswrapper[13205]: I0319 09:27:14.288512 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 09:27:14.370937 master-0 kubenswrapper[13205]: I0319 09:27:14.370879 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 09:27:14.448283 master-0 kubenswrapper[13205]: I0319 09:27:14.448240 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 09:27:14.457897 master-0 kubenswrapper[13205]: I0319 09:27:14.457853 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:27:15.300331 master-0 kubenswrapper[13205]: I0319 09:27:15.300219 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 09:27:17.864238 master-0 kubenswrapper[13205]: I0319 09:27:17.864160 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log"
Mar 19 09:27:17.864238 master-0 kubenswrapper[13205]: I0319 09:27:17.864242 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:27:18.017493 master-0 kubenswrapper[13205]: I0319 09:27:18.017413 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 09:27:18.017493 master-0 kubenswrapper[13205]: I0319 09:27:18.017448 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 09:27:18.017493 master-0 kubenswrapper[13205]: I0319 09:27:18.017466 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 09:27:18.017992 master-0 kubenswrapper[13205]: I0319 09:27:18.017519 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 09:27:18.017992 master-0 kubenswrapper[13205]: I0319 09:27:18.017575 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock" (OuterVolumeSpecName: "var-lock") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:27:18.017992 master-0 kubenswrapper[13205]: I0319 09:27:18.017630 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 09:27:18.017992 master-0 kubenswrapper[13205]: I0319 09:27:18.017629 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log" (OuterVolumeSpecName: "var-log") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:27:18.017992 master-0 kubenswrapper[13205]: I0319 09:27:18.017761 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:27:18.017992 master-0 kubenswrapper[13205]: I0319 09:27:18.017806 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests" (OuterVolumeSpecName: "manifests") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:27:18.018447 master-0 kubenswrapper[13205]: I0319 09:27:18.018219 13205 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:18.018447 master-0 kubenswrapper[13205]: I0319 09:27:18.018241 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:18.018447 master-0 kubenswrapper[13205]: I0319 09:27:18.018253 13205 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:18.018447 master-0 kubenswrapper[13205]: I0319 09:27:18.018264 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:18.023330 master-0 kubenswrapper[13205]: I0319 09:27:18.023247 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:27:18.119262 master-0 kubenswrapper[13205]: I0319 09:27:18.119128 13205 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:18.196860 master-0 kubenswrapper[13205]: I0319 09:27:18.196789 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log"
Mar 19 09:27:18.197111 master-0 kubenswrapper[13205]: I0319 09:27:18.196876 13205 generic.go:334] "Generic (PLEG): container finished" podID="8e7a82869988463543d3d8dd1f0b5fe3" containerID="21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c" exitCode=137
Mar 19 09:27:18.197111 master-0 kubenswrapper[13205]: I0319 09:27:18.196934 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:27:18.197111 master-0 kubenswrapper[13205]: I0319 09:27:18.196937 13205 scope.go:117] "RemoveContainer" containerID="21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c"
Mar 19 09:27:18.223546 master-0 kubenswrapper[13205]: I0319 09:27:18.223491 13205 scope.go:117] "RemoveContainer" containerID="21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c"
Mar 19 09:27:18.224013 master-0 kubenswrapper[13205]: E0319 09:27:18.223970 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c\": container with ID starting with 21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c not found: ID does not exist" containerID="21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c"
Mar 19 09:27:18.224076 master-0 kubenswrapper[13205]: I0319 09:27:18.224022 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c"} err="failed to get container status \"21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c\": rpc error: code = NotFound desc = could not find container \"21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c\": container with ID starting with 21f8e3fb8b8f4a721bfc75169c91fa6e54e02924bb118e4fafe0a8bc341e366c not found: ID does not exist"
Mar 19 09:27:18.861832 master-0 kubenswrapper[13205]: I0319 09:27:18.861711 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7a82869988463543d3d8dd1f0b5fe3" path="/var/lib/kubelet/pods/8e7a82869988463543d3d8dd1f0b5fe3/volumes"
Mar 19 09:27:24.018878 master-0 kubenswrapper[13205]: I0319 09:27:24.018788 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:27:24.024391 master-0 kubenswrapper[13205]: I0319 09:27:24.024268 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:27:31.789674 master-0 kubenswrapper[13205]: I0319 09:27:31.789584 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:27:35.213364 master-0 kubenswrapper[13205]: E0319 09:27:35.213240 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:27:40.355873 master-0 kubenswrapper[13205]: I0319 09:27:40.355803 13205 scope.go:117] "RemoveContainer" containerID="a80a075ae2d2bfe0e545df390d9ff0ad18516cad1ed3ad4a716e570d8e5f21c1" Mar 19 09:27:41.814293 master-0 kubenswrapper[13205]: I0319 09:27:41.814234 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 09:27:44.300133 master-0 kubenswrapper[13205]: I0319 09:27:44.300088 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-gpc6r" Mar 19 09:27:49.262470 master-0 kubenswrapper[13205]: I0319 09:27:49.262402 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 09:27:56.013910 master-0 kubenswrapper[13205]: I0319 09:27:56.013850 13205 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-etcd/installer-2-master-0"] Mar 19 09:27:56.014451 master-0 kubenswrapper[13205]: E0319 09:27:56.014144 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 09:27:56.014451 master-0 kubenswrapper[13205]: I0319 09:27:56.014157 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 09:27:56.014451 master-0 kubenswrapper[13205]: E0319 09:27:56.014173 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" containerName="installer" Mar 19 09:27:56.014451 master-0 kubenswrapper[13205]: I0319 09:27:56.014179 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" containerName="installer" Mar 19 09:27:56.014451 master-0 kubenswrapper[13205]: I0319 09:27:56.014307 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 09:27:56.014451 master-0 kubenswrapper[13205]: I0319 09:27:56.014322 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f12c099-d9a7-48a9-9965-c339c4e32d31" containerName="installer" Mar 19 09:27:56.014814 master-0 kubenswrapper[13205]: I0319 09:27:56.014756 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.017740 master-0 kubenswrapper[13205]: I0319 09:27:56.017696 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-5sn8v" Mar 19 09:27:56.017836 master-0 kubenswrapper[13205]: I0319 09:27:56.017778 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 09:27:56.112633 master-0 kubenswrapper[13205]: I0319 09:27:56.112578 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 09:27:56.169978 master-0 kubenswrapper[13205]: I0319 09:27:56.169924 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b149c739-203d-4f5a-af11-dba6835ed71d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.170315 master-0 kubenswrapper[13205]: I0319 09:27:56.170295 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.170457 master-0 kubenswrapper[13205]: I0319 09:27:56.170440 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-var-lock\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.271660 master-0 kubenswrapper[13205]: I0319 09:27:56.271508 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b149c739-203d-4f5a-af11-dba6835ed71d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.271858 master-0 kubenswrapper[13205]: I0319 09:27:56.271711 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.271900 master-0 kubenswrapper[13205]: I0319 09:27:56.271831 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.271976 master-0 kubenswrapper[13205]: I0319 09:27:56.271930 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-var-lock\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.272083 master-0 kubenswrapper[13205]: I0319 09:27:56.272053 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-var-lock\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.557228 master-0 kubenswrapper[13205]: I0319 09:27:56.557099 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b149c739-203d-4f5a-af11-dba6835ed71d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:56.638925 master-0 kubenswrapper[13205]: I0319 09:27:56.638859 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 09:27:57.430289 master-0 kubenswrapper[13205]: I0319 09:27:57.430193 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 09:27:57.439508 master-0 kubenswrapper[13205]: W0319 09:27:57.439401 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb149c739_203d_4f5a_af11_dba6835ed71d.slice/crio-d12e946845cda996b442f035801c35ad1cfcaa3e18a6b833bae46e4013d50906 WatchSource:0}: Error finding container d12e946845cda996b442f035801c35ad1cfcaa3e18a6b833bae46e4013d50906: Status 404 returned error can't find the container with id d12e946845cda996b442f035801c35ad1cfcaa3e18a6b833bae46e4013d50906 Mar 19 09:27:57.465140 master-0 kubenswrapper[13205]: I0319 09:27:57.465072 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"b149c739-203d-4f5a-af11-dba6835ed71d","Type":"ContainerStarted","Data":"d12e946845cda996b442f035801c35ad1cfcaa3e18a6b833bae46e4013d50906"} Mar 19 09:27:58.471746 master-0 kubenswrapper[13205]: I0319 09:27:58.471626 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"b149c739-203d-4f5a-af11-dba6835ed71d","Type":"ContainerStarted","Data":"bf14a49c61279170cec510e38fbb1a248535b54a84660aba24b506922fec9fa1"} Mar 19 09:27:58.815561 master-0 kubenswrapper[13205]: I0319 09:27:58.815429 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=3.815402947 
podStartE2EDuration="3.815402947s" podCreationTimestamp="2026-03-19 09:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:58.808834123 +0000 UTC m=+264.141141021" watchObservedRunningTime="2026-03-19 09:27:58.815402947 +0000 UTC m=+264.147709835" Mar 19 09:28:15.658649 master-0 kubenswrapper[13205]: I0319 09:28:15.655329 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-fd57cd489-6jmpf"] Mar 19 09:28:15.658649 master-0 kubenswrapper[13205]: I0319 09:28:15.657210 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.660564 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.660797 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.660832 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-hnn89" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.661213 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.661223 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.662783 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 09:28:15.670487 master-0 
kubenswrapper[13205]: I0319 09:28:15.663314 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.663462 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.663653 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.663732 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.663908 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 09:28:15.670487 master-0 kubenswrapper[13205]: I0319 09:28:15.664059 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 09:28:15.675585 master-0 kubenswrapper[13205]: I0319 09:28:15.675369 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 09:28:15.685652 master-0 kubenswrapper[13205]: I0319 09:28:15.682092 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 09:28:15.697216 master-0 kubenswrapper[13205]: I0319 09:28:15.697139 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-login\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: 
\"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697445 master-0 kubenswrapper[13205]: I0319 09:28:15.697230 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697445 master-0 kubenswrapper[13205]: I0319 09:28:15.697338 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-policies\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697445 master-0 kubenswrapper[13205]: I0319 09:28:15.697392 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697607 master-0 kubenswrapper[13205]: I0319 09:28:15.697554 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-error\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 
09:28:15.697682 master-0 kubenswrapper[13205]: I0319 09:28:15.697644 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697775 master-0 kubenswrapper[13205]: I0319 09:28:15.697742 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-session\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697812 master-0 kubenswrapper[13205]: I0319 09:28:15.697797 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697846 master-0 kubenswrapper[13205]: I0319 09:28:15.697835 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697929 master-0 kubenswrapper[13205]: I0319 09:28:15.697890 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-dir\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.697985 master-0 kubenswrapper[13205]: I0319 09:28:15.697964 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.698023 master-0 kubenswrapper[13205]: I0319 09:28:15.698007 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.698324 master-0 kubenswrapper[13205]: I0319 09:28:15.698290 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxcg\" (UniqueName: \"kubernetes.io/projected/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-kube-api-access-7xxcg\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.721164 master-0 kubenswrapper[13205]: I0319 09:28:15.717235 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-fd57cd489-6jmpf"] Mar 19 09:28:15.799949 master-0 
kubenswrapper[13205]: I0319 09:28:15.799871 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxcg\" (UniqueName: \"kubernetes.io/projected/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-kube-api-access-7xxcg\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800175 master-0 kubenswrapper[13205]: I0319 09:28:15.800050 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-login\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800175 master-0 kubenswrapper[13205]: I0319 09:28:15.800113 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800175 master-0 kubenswrapper[13205]: I0319 09:28:15.800145 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-policies\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800272 master-0 kubenswrapper[13205]: I0319 09:28:15.800179 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-error\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800272 master-0 kubenswrapper[13205]: I0319 09:28:15.800199 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800272 master-0 kubenswrapper[13205]: I0319 09:28:15.800220 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800368 master-0 kubenswrapper[13205]: I0319 09:28:15.800288 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-session\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800368 master-0 kubenswrapper[13205]: I0319 09:28:15.800325 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: 
\"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800368 master-0 kubenswrapper[13205]: I0319 09:28:15.800356 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800465 master-0 kubenswrapper[13205]: I0319 09:28:15.800379 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-dir\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800465 master-0 kubenswrapper[13205]: I0319 09:28:15.800429 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.800550 master-0 kubenswrapper[13205]: I0319 09:28:15.800465 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.801002 master-0 kubenswrapper[13205]: I0319 09:28:15.800930 13205 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-dir\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.811345 master-0 kubenswrapper[13205]: E0319 09:28:15.801636 13205 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: secret "v4-0-config-system-router-certs" not found Mar 19 09:28:15.811345 master-0 kubenswrapper[13205]: E0319 09:28:15.802007 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs podName:e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:16.301981143 +0000 UTC m=+281.634288031 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs") pod "oauth-openshift-fd57cd489-6jmpf" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5") : secret "v4-0-config-system-router-certs" not found Mar 19 09:28:15.811345 master-0 kubenswrapper[13205]: I0319 09:28:15.802171 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.811345 master-0 kubenswrapper[13205]: E0319 09:28:15.802433 13205 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:15.811345 master-0 kubenswrapper[13205]: I0319 
09:28:15.802462 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.811345 master-0 kubenswrapper[13205]: E0319 09:28:15.802490 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig podName:e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:16.302475785 +0000 UTC m=+281.634782673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig") pod "oauth-openshift-fd57cd489-6jmpf" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5") : configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:15.811345 master-0 kubenswrapper[13205]: I0319 09:28:15.802725 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-policies\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.817769 master-0 kubenswrapper[13205]: I0319 09:28:15.817718 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-error\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 
09:28:15.818996 master-0 kubenswrapper[13205]: I0319 09:28:15.818940 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.819490 master-0 kubenswrapper[13205]: I0319 09:28:15.819439 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-session\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.819962 master-0 kubenswrapper[13205]: I0319 09:28:15.819914 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.820183 master-0 kubenswrapper[13205]: I0319 09:28:15.820093 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:15.820786 master-0 kubenswrapper[13205]: I0319 09:28:15.820717 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-login\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:16.298550 master-0 kubenswrapper[13205]: I0319 09:28:16.293893 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxcg\" (UniqueName: \"kubernetes.io/projected/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-kube-api-access-7xxcg\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:16.309601 master-0 kubenswrapper[13205]: I0319 09:28:16.309502 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:16.309821 master-0 kubenswrapper[13205]: I0319 09:28:16.309627 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:16.309908 master-0 kubenswrapper[13205]: E0319 09:28:16.309833 13205 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: secret "v4-0-config-system-router-certs" not found Mar 19 09:28:16.309958 master-0 kubenswrapper[13205]: E0319 09:28:16.309917 13205 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs podName:e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:17.309890336 +0000 UTC m=+282.642197224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs") pod "oauth-openshift-fd57cd489-6jmpf" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5") : secret "v4-0-config-system-router-certs" not found Mar 19 09:28:16.310445 master-0 kubenswrapper[13205]: E0319 09:28:16.310414 13205 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:16.310505 master-0 kubenswrapper[13205]: E0319 09:28:16.310457 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig podName:e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:17.31044665 +0000 UTC m=+282.642753538 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig") pod "oauth-openshift-fd57cd489-6jmpf" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5") : configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:17.323868 master-0 kubenswrapper[13205]: I0319 09:28:17.323775 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:17.324727 master-0 kubenswrapper[13205]: I0319 09:28:17.323953 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:17.324727 master-0 kubenswrapper[13205]: E0319 09:28:17.323985 13205 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:17.324727 master-0 kubenswrapper[13205]: E0319 09:28:17.324098 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig podName:e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:19.324074733 +0000 UTC m=+284.656381671 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig") pod "oauth-openshift-fd57cd489-6jmpf" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5") : configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:17.327364 master-0 kubenswrapper[13205]: I0319 09:28:17.327322 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:19.355889 master-0 kubenswrapper[13205]: I0319 09:28:19.355757 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:19.356855 master-0 kubenswrapper[13205]: E0319 09:28:19.355927 13205 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:19.356855 master-0 kubenswrapper[13205]: E0319 09:28:19.356034 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig podName:e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:23.356012943 +0000 UTC m=+288.688319841 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig") pod "oauth-openshift-fd57cd489-6jmpf" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5") : configmap "v4-0-config-system-cliconfig" not found Mar 19 09:28:23.417330 master-0 kubenswrapper[13205]: I0319 09:28:23.417184 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:23.418589 master-0 kubenswrapper[13205]: I0319 09:28:23.418492 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fd57cd489-6jmpf\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:23.492635 master-0 kubenswrapper[13205]: I0319 09:28:23.492478 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:28:24.596440 master-0 kubenswrapper[13205]: I0319 09:28:24.596350 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-fd57cd489-6jmpf"] Mar 19 09:28:24.829901 master-0 kubenswrapper[13205]: I0319 09:28:24.829841 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-fd57cd489-6jmpf"] Mar 19 09:28:24.845519 master-0 kubenswrapper[13205]: W0319 09:28:24.845439 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode199b2f1_71c1_40ff_b4f7_4bbbb66ed9f5.slice/crio-126ac4435cf417f6c704f4b02df9d702c9bd43bddbef0b99b8aeeb2f414a1645 WatchSource:0}: Error finding container 126ac4435cf417f6c704f4b02df9d702c9bd43bddbef0b99b8aeeb2f414a1645: Status 404 returned error can't find the container with id 126ac4435cf417f6c704f4b02df9d702c9bd43bddbef0b99b8aeeb2f414a1645 Mar 19 09:28:25.692577 master-0 kubenswrapper[13205]: I0319 09:28:25.692341 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" event={"ID":"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5","Type":"ContainerStarted","Data":"126ac4435cf417f6c704f4b02df9d702c9bd43bddbef0b99b8aeeb2f414a1645"} Mar 19 09:28:26.212971 master-0 kubenswrapper[13205]: I0319 09:28:26.212683 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79f67cdc89-bx72w"] Mar 19 09:28:26.214302 master-0 kubenswrapper[13205]: I0319 09:28:26.214246 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.217879 master-0 kubenswrapper[13205]: I0319 09:28:26.217814 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 09:28:26.218404 master-0 kubenswrapper[13205]: I0319 09:28:26.218365 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 09:28:26.218946 master-0 kubenswrapper[13205]: I0319 09:28:26.218902 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 09:28:26.220594 master-0 kubenswrapper[13205]: I0319 09:28:26.220520 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 09:28:26.220968 master-0 kubenswrapper[13205]: I0319 09:28:26.220637 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 09:28:26.225714 master-0 kubenswrapper[13205]: I0319 09:28:26.225632 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-88lgr" Mar 19 09:28:26.235611 master-0 kubenswrapper[13205]: I0319 09:28:26.235500 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 09:28:26.259793 master-0 kubenswrapper[13205]: I0319 09:28:26.259734 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-oauth-config\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.260008 master-0 kubenswrapper[13205]: I0319 09:28:26.259836 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-service-ca\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.260008 master-0 kubenswrapper[13205]: I0319 09:28:26.259865 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-serving-cert\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.260008 master-0 kubenswrapper[13205]: I0319 09:28:26.259889 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-config\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.260008 master-0 kubenswrapper[13205]: I0319 09:28:26.259921 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-oauth-serving-cert\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.260170 master-0 kubenswrapper[13205]: I0319 09:28:26.260091 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lv9\" (UniqueName: \"kubernetes.io/projected/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-kube-api-access-m5lv9\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.260247 master-0 kubenswrapper[13205]: 
I0319 09:28:26.260194 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-trusted-ca-bundle\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.361185 master-0 kubenswrapper[13205]: I0319 09:28:26.361101 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-oauth-config\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.361491 master-0 kubenswrapper[13205]: I0319 09:28:26.361363 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-service-ca\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.361491 master-0 kubenswrapper[13205]: I0319 09:28:26.361404 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-serving-cert\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.361491 master-0 kubenswrapper[13205]: I0319 09:28:26.361451 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-config\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 
19 09:28:26.361759 master-0 kubenswrapper[13205]: I0319 09:28:26.361506 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-oauth-serving-cert\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.361759 master-0 kubenswrapper[13205]: I0319 09:28:26.361626 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-trusted-ca-bundle\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.361759 master-0 kubenswrapper[13205]: I0319 09:28:26.361662 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lv9\" (UniqueName: \"kubernetes.io/projected/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-kube-api-access-m5lv9\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.363249 master-0 kubenswrapper[13205]: I0319 09:28:26.363205 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-service-ca\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.363465 master-0 kubenswrapper[13205]: I0319 09:28:26.363437 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-oauth-serving-cert\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " 
pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.363680 master-0 kubenswrapper[13205]: I0319 09:28:26.363613 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-trusted-ca-bundle\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.364424 master-0 kubenswrapper[13205]: I0319 09:28:26.364286 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-config\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.367605 master-0 kubenswrapper[13205]: I0319 09:28:26.367506 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-serving-cert\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.367936 master-0 kubenswrapper[13205]: I0319 09:28:26.367875 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-oauth-config\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:26.969838 master-0 kubenswrapper[13205]: I0319 09:28:26.969584 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79f67cdc89-bx72w"] Mar 19 09:28:27.065437 master-0 kubenswrapper[13205]: I0319 09:28:27.065367 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m5lv9\" (UniqueName: \"kubernetes.io/projected/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-kube-api-access-m5lv9\") pod \"console-79f67cdc89-bx72w\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") " pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:27.143896 master-0 kubenswrapper[13205]: I0319 09:28:27.143263 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:29.662521 master-0 kubenswrapper[13205]: I0319 09:28:29.662396 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79f67cdc89-bx72w"] Mar 19 09:28:30.142971 master-0 kubenswrapper[13205]: W0319 09:28:30.142917 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8e5bd7_de13_4773_8a38_5edf4fda23fd.slice/crio-11181626b19d5fb500e0d964d41a94de1d31a7b757550c9a05f23f193dd72f08 WatchSource:0}: Error finding container 11181626b19d5fb500e0d964d41a94de1d31a7b757550c9a05f23f193dd72f08: Status 404 returned error can't find the container with id 11181626b19d5fb500e0d964d41a94de1d31a7b757550c9a05f23f193dd72f08 Mar 19 09:28:30.734610 master-0 kubenswrapper[13205]: I0319 09:28:30.734504 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79f67cdc89-bx72w" event={"ID":"5a8e5bd7-de13-4773-8a38-5edf4fda23fd","Type":"ContainerStarted","Data":"11181626b19d5fb500e0d964d41a94de1d31a7b757550c9a05f23f193dd72f08"} Mar 19 09:28:30.793594 master-0 kubenswrapper[13205]: I0319 09:28:30.793504 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 19 09:28:30.794343 master-0 kubenswrapper[13205]: I0319 09:28:30.794312 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:30.796320 master-0 kubenswrapper[13205]: I0319 09:28:30.796285 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nm2j7" Mar 19 09:28:30.797712 master-0 kubenswrapper[13205]: I0319 09:28:30.797682 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:28:30.946561 master-0 kubenswrapper[13205]: I0319 09:28:30.944586 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-var-lock\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:30.946561 master-0 kubenswrapper[13205]: I0319 09:28:30.944640 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:30.946561 master-0 kubenswrapper[13205]: I0319 09:28:30.945850 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kube-api-access\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:30.980794 master-0 kubenswrapper[13205]: I0319 09:28:30.980561 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 19 09:28:31.047623 master-0 
kubenswrapper[13205]: I0319 09:28:31.047567 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-var-lock\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:31.047623 master-0 kubenswrapper[13205]: I0319 09:28:31.047621 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:31.047920 master-0 kubenswrapper[13205]: I0319 09:28:31.047686 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kube-api-access\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:31.047920 master-0 kubenswrapper[13205]: I0319 09:28:31.047701 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-var-lock\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:31.047920 master-0 kubenswrapper[13205]: I0319 09:28:31.047828 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:31.393635 
master-0 kubenswrapper[13205]: I0319 09:28:31.391140 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kube-api-access\") pod \"installer-2-master-0\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:31.420100 master-0 kubenswrapper[13205]: I0319 09:28:31.420032 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:28:31.740763 master-0 kubenswrapper[13205]: I0319 09:28:31.740710 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"] Mar 19 09:28:31.741574 master-0 kubenswrapper[13205]: I0319 09:28:31.741542 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" podUID="1d5e311c-1c6a-4d5d-8c2b-493025593934" containerName="controller-manager" containerID="cri-o://53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378" gracePeriod=30 Mar 19 09:28:31.844066 master-0 kubenswrapper[13205]: I0319 09:28:31.843814 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"] Mar 19 09:28:31.844284 master-0 kubenswrapper[13205]: I0319 09:28:31.844054 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" podUID="67d66357-fcee-4e70-b563-5895b978ab55" containerName="route-controller-manager" containerID="cri-o://6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194" gracePeriod=30 Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.857043 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-etcd/etcd-master-0"]
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.857396 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" containerID="cri-o://88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51" gracePeriod=30
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.857546 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" containerID="cri-o://707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb" gracePeriod=30
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.857578 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" containerID="cri-o://5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23" gracePeriod=30
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.857629 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" containerID="cri-o://f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6" gracePeriod=30
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.858053 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" containerID="cri-o://4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492" gracePeriod=30
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.859934 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860436 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860454 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860472 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860480 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860497 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860505 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860518 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860542 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860554 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860563 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860580 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860588 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860603 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860610 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: E0319 09:28:31.860621 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860628 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860789 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860803 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860816 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860823 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860832 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860843 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860861 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 09:28:31.862657 master-0 kubenswrapper[13205]: I0319 09:28:31.860872 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 09:28:32.077386 master-0 kubenswrapper[13205]: I0319 09:28:32.077304 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.077622 master-0 kubenswrapper[13205]: I0319 09:28:32.077420 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.077622 master-0 kubenswrapper[13205]: I0319 09:28:32.077457 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.077622 master-0 kubenswrapper[13205]: I0319 09:28:32.077585 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.077801 master-0 kubenswrapper[13205]: I0319 09:28:32.077642 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.077801 master-0 kubenswrapper[13205]: I0319 09:28:32.077665 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.154888 master-0 kubenswrapper[13205]: I0319 09:28:32.154795 13205 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.32.10:9980/readyz\": dial tcp 192.168.32.10:9980: connect: connection refused" start-of-body=
Mar 19 09:28:32.154888 master-0 kubenswrapper[13205]: I0319 09:28:32.154858 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" probeResult="failure" output="Get \"https://192.168.32.10:9980/readyz\": dial tcp 192.168.32.10:9980: connect: connection refused"
Mar 19 09:28:32.179590 master-0 kubenswrapper[13205]: I0319 09:28:32.179471 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179590 master-0 kubenswrapper[13205]: I0319 09:28:32.179517 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179601 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179614 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179674 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179680 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179701 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179681 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179794 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179834 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179868 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.179921 master-0 kubenswrapper[13205]: I0319 09:28:32.179881 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:28:32.275871 master-0 kubenswrapper[13205]: I0319 09:28:32.275768 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:28:32.280775 master-0 kubenswrapper[13205]: I0319 09:28:32.280736 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert\") pod \"67d66357-fcee-4e70-b563-5895b978ab55\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") "
Mar 19 09:28:32.280853 master-0 kubenswrapper[13205]: I0319 09:28:32.280793 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config\") pod \"67d66357-fcee-4e70-b563-5895b978ab55\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") "
Mar 19 09:28:32.280853 master-0 kubenswrapper[13205]: I0319 09:28:32.280823 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sclqq\" (UniqueName: \"kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq\") pod \"67d66357-fcee-4e70-b563-5895b978ab55\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") "
Mar 19 09:28:32.280853 master-0 kubenswrapper[13205]: I0319 09:28:32.280849 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca\") pod \"67d66357-fcee-4e70-b563-5895b978ab55\" (UID: \"67d66357-fcee-4e70-b563-5895b978ab55\") "
Mar 19 09:28:32.281442 master-0 kubenswrapper[13205]: I0319 09:28:32.281397 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca" (OuterVolumeSpecName: "client-ca") pod "67d66357-fcee-4e70-b563-5895b978ab55" (UID: "67d66357-fcee-4e70-b563-5895b978ab55"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:32.281511 master-0 kubenswrapper[13205]: I0319 09:28:32.281477 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config" (OuterVolumeSpecName: "config") pod "67d66357-fcee-4e70-b563-5895b978ab55" (UID: "67d66357-fcee-4e70-b563-5895b978ab55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:32.283509 master-0 kubenswrapper[13205]: I0319 09:28:32.283480 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67d66357-fcee-4e70-b563-5895b978ab55" (UID: "67d66357-fcee-4e70-b563-5895b978ab55"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:32.284607 master-0 kubenswrapper[13205]: I0319 09:28:32.284575 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq" (OuterVolumeSpecName: "kube-api-access-sclqq") pod "67d66357-fcee-4e70-b563-5895b978ab55" (UID: "67d66357-fcee-4e70-b563-5895b978ab55"). InnerVolumeSpecName "kube-api-access-sclqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:28:32.381940 master-0 kubenswrapper[13205]: I0319 09:28:32.381870 13205 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d66357-fcee-4e70-b563-5895b978ab55-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.381940 master-0 kubenswrapper[13205]: I0319 09:28:32.381902 13205 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.381940 master-0 kubenswrapper[13205]: I0319 09:28:32.381911 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sclqq\" (UniqueName: \"kubernetes.io/projected/67d66357-fcee-4e70-b563-5895b978ab55-kube-api-access-sclqq\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.381940 master-0 kubenswrapper[13205]: I0319 09:28:32.381921 13205 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67d66357-fcee-4e70-b563-5895b978ab55-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.697255 master-0 kubenswrapper[13205]: I0319 09:28:32.697207 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:28:32.748164 master-0 kubenswrapper[13205]: I0319 09:28:32.748065 13205 generic.go:334] "Generic (PLEG): container finished" podID="1d5e311c-1c6a-4d5d-8c2b-493025593934" containerID="53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378" exitCode=0
Mar 19 09:28:32.748164 master-0 kubenswrapper[13205]: I0319 09:28:32.748132 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" event={"ID":"1d5e311c-1c6a-4d5d-8c2b-493025593934","Type":"ContainerDied","Data":"53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378"}
Mar 19 09:28:32.748164 master-0 kubenswrapper[13205]: I0319 09:28:32.748158 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" event={"ID":"1d5e311c-1c6a-4d5d-8c2b-493025593934","Type":"ContainerDied","Data":"3eefb4e7e53cb0d4c4a18064ef7d910510d0608a89ca096908a4ffccd0aaebda"}
Mar 19 09:28:32.748755 master-0 kubenswrapper[13205]: I0319 09:28:32.748175 13205 scope.go:117] "RemoveContainer" containerID="53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378"
Mar 19 09:28:32.748755 master-0 kubenswrapper[13205]: I0319 09:28:32.748245 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"
Mar 19 09:28:32.750974 master-0 kubenswrapper[13205]: I0319 09:28:32.750915 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 19 09:28:32.751734 master-0 kubenswrapper[13205]: I0319 09:28:32.751703 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 19 09:28:32.753297 master-0 kubenswrapper[13205]: I0319 09:28:32.753276 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492" exitCode=2
Mar 19 09:28:32.753297 master-0 kubenswrapper[13205]: I0319 09:28:32.753297 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6" exitCode=0
Mar 19 09:28:32.753441 master-0 kubenswrapper[13205]: I0319 09:28:32.753306 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23" exitCode=2
Mar 19 09:28:32.755317 master-0 kubenswrapper[13205]: I0319 09:28:32.755156 13205 generic.go:334] "Generic (PLEG): container finished" podID="b149c739-203d-4f5a-af11-dba6835ed71d" containerID="bf14a49c61279170cec510e38fbb1a248535b54a84660aba24b506922fec9fa1" exitCode=0
Mar 19 09:28:32.755317 master-0 kubenswrapper[13205]: I0319 09:28:32.755194 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"b149c739-203d-4f5a-af11-dba6835ed71d","Type":"ContainerDied","Data":"bf14a49c61279170cec510e38fbb1a248535b54a84660aba24b506922fec9fa1"}
Mar 19 09:28:32.759692 master-0 kubenswrapper[13205]: I0319 09:28:32.756308 13205 generic.go:334] "Generic (PLEG): container finished" podID="67d66357-fcee-4e70-b563-5895b978ab55" containerID="6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194" exitCode=0
Mar 19 09:28:32.759692 master-0 kubenswrapper[13205]: I0319 09:28:32.756352 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" event={"ID":"67d66357-fcee-4e70-b563-5895b978ab55","Type":"ContainerDied","Data":"6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194"}
Mar 19 09:28:32.759692 master-0 kubenswrapper[13205]: I0319 09:28:32.756370 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" event={"ID":"67d66357-fcee-4e70-b563-5895b978ab55","Type":"ContainerDied","Data":"6bf8f167b730f8b123fa119481aeceac0bccae7e125576f133fb9531cd659c54"}
Mar 19 09:28:32.759692 master-0 kubenswrapper[13205]: I0319 09:28:32.756411 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"
Mar 19 09:28:32.761908 master-0 kubenswrapper[13205]: I0319 09:28:32.761745 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" event={"ID":"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5","Type":"ContainerStarted","Data":"e885c91e3a7232fbbf5adc2d6e560bf02206f0e76f4d1357f37135b4d633b27d"}
Mar 19 09:28:32.762000 master-0 kubenswrapper[13205]: I0319 09:28:32.761944 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf"
Mar 19 09:28:32.772827 master-0 kubenswrapper[13205]: I0319 09:28:32.772776 13205 scope.go:117] "RemoveContainer" containerID="53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378"
Mar 19 09:28:32.773429 master-0 kubenswrapper[13205]: E0319 09:28:32.773203 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378\": container with ID starting with 53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378 not found: ID does not exist" containerID="53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378"
Mar 19 09:28:32.773429 master-0 kubenswrapper[13205]: I0319 09:28:32.773263 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378"} err="failed to get container status \"53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378\": rpc error: code = NotFound desc = could not find container \"53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378\": container with ID starting with 53e308ad3e920cf718db2669add347520e6914677c000144b9935ffb3a9ab378 not found: ID does not exist"
Mar 19 09:28:32.773429 master-0 kubenswrapper[13205]: I0319 09:28:32.773292 13205 scope.go:117] "RemoveContainer" containerID="6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194"
Mar 19 09:28:32.786418 master-0 kubenswrapper[13205]: I0319 09:28:32.786365 13205 scope.go:117] "RemoveContainer" containerID="6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194"
Mar 19 09:28:32.786836 master-0 kubenswrapper[13205]: E0319 09:28:32.786797 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194\": container with ID starting with 6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194 not found: ID does not exist" containerID="6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194"
Mar 19 09:28:32.786895 master-0 kubenswrapper[13205]: I0319 09:28:32.786846 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194"} err="failed to get container status \"6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194\": rpc error: code = NotFound desc = could not find container \"6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194\": container with ID starting with 6d9b6000085145f95cc299f0c5e0cc485ef648897ac48843b65836901b434194 not found: ID does not exist"
Mar 19 09:28:32.889131 master-0 kubenswrapper[13205]: I0319 09:28:32.888820 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49fpz\" (UniqueName: \"kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz\") pod \"1d5e311c-1c6a-4d5d-8c2b-493025593934\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") "
Mar 19 09:28:32.889131 master-0 kubenswrapper[13205]: I0319 09:28:32.889041 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert\") pod \"1d5e311c-1c6a-4d5d-8c2b-493025593934\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") "
Mar 19 09:28:32.889131 master-0 kubenswrapper[13205]: I0319 09:28:32.889095 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca\") pod \"1d5e311c-1c6a-4d5d-8c2b-493025593934\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") "
Mar 19 09:28:32.889364 master-0 kubenswrapper[13205]: I0319 09:28:32.889170 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config\") pod \"1d5e311c-1c6a-4d5d-8c2b-493025593934\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") "
Mar 19 09:28:32.889364 master-0 kubenswrapper[13205]: I0319 09:28:32.889196 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles\") pod \"1d5e311c-1c6a-4d5d-8c2b-493025593934\" (UID: \"1d5e311c-1c6a-4d5d-8c2b-493025593934\") "
Mar 19 09:28:32.889950 master-0 kubenswrapper[13205]: I0319 09:28:32.889916 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d5e311c-1c6a-4d5d-8c2b-493025593934" (UID: "1d5e311c-1c6a-4d5d-8c2b-493025593934"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:32.890218 master-0 kubenswrapper[13205]: I0319 09:28:32.890155 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1d5e311c-1c6a-4d5d-8c2b-493025593934" (UID: "1d5e311c-1c6a-4d5d-8c2b-493025593934"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:32.893366 master-0 kubenswrapper[13205]: I0319 09:28:32.891518 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config" (OuterVolumeSpecName: "config") pod "1d5e311c-1c6a-4d5d-8c2b-493025593934" (UID: "1d5e311c-1c6a-4d5d-8c2b-493025593934"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:32.894386 master-0 kubenswrapper[13205]: I0319 09:28:32.894319 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz" (OuterVolumeSpecName: "kube-api-access-49fpz") pod "1d5e311c-1c6a-4d5d-8c2b-493025593934" (UID: "1d5e311c-1c6a-4d5d-8c2b-493025593934"). InnerVolumeSpecName "kube-api-access-49fpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:28:32.895810 master-0 kubenswrapper[13205]: I0319 09:28:32.895728 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d5e311c-1c6a-4d5d-8c2b-493025593934" (UID: "1d5e311c-1c6a-4d5d-8c2b-493025593934"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:32.991390 master-0 kubenswrapper[13205]: I0319 09:28:32.991295 13205 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d5e311c-1c6a-4d5d-8c2b-493025593934-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.991390 master-0 kubenswrapper[13205]: I0319 09:28:32.991366 13205 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.991390 master-0 kubenswrapper[13205]: I0319 09:28:32.991380 13205 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.991390 master-0 kubenswrapper[13205]: I0319 09:28:32.991392 13205 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d5e311c-1c6a-4d5d-8c2b-493025593934-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:32.991390 master-0 kubenswrapper[13205]: I0319 09:28:32.991405 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49fpz\" (UniqueName: \"kubernetes.io/projected/1d5e311c-1c6a-4d5d-8c2b-493025593934-kube-api-access-49fpz\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:33.762930 master-0 kubenswrapper[13205]: I0319 09:28:33.762774 13205 patch_prober.go:28] interesting pod/oauth-openshift-fd57cd489-6jmpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:28:33.762930 master-0 kubenswrapper[13205]: I0319 09:28:33.762864 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:28:34.763690 master-0 kubenswrapper[13205]: I0319 09:28:34.763622 13205 patch_prober.go:28] interesting pod/oauth-openshift-fd57cd489-6jmpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:28:34.763690 master-0 kubenswrapper[13205]: I0319 09:28:34.763691 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:28:34.907474 master-0 kubenswrapper[13205]: I0319 09:28:34.907392 13205 kubelet.go:1505] "Image garbage collection succeeded"
Mar 19 09:28:34.935253 master-0 kubenswrapper[13205]: I0319 09:28:34.935205 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 19 09:28:35.018302 master-0 kubenswrapper[13205]: I0319 09:28:35.018207 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-var-lock\") pod \"b149c739-203d-4f5a-af11-dba6835ed71d\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") "
Mar 19 09:28:35.018559 master-0 kubenswrapper[13205]: I0319 09:28:35.018314 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b149c739-203d-4f5a-af11-dba6835ed71d-kube-api-access\") pod \"b149c739-203d-4f5a-af11-dba6835ed71d\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") "
Mar 19 09:28:35.018559 master-0 kubenswrapper[13205]: I0319 09:28:35.018309 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-var-lock" (OuterVolumeSpecName: "var-lock") pod "b149c739-203d-4f5a-af11-dba6835ed71d" (UID: "b149c739-203d-4f5a-af11-dba6835ed71d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:28:35.018559 master-0 kubenswrapper[13205]: I0319 09:28:35.018363 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-kubelet-dir\") pod \"b149c739-203d-4f5a-af11-dba6835ed71d\" (UID: \"b149c739-203d-4f5a-af11-dba6835ed71d\") "
Mar 19 09:28:35.018559 master-0 kubenswrapper[13205]: I0319 09:28:35.018483 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b149c739-203d-4f5a-af11-dba6835ed71d" (UID: "b149c739-203d-4f5a-af11-dba6835ed71d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:28:35.018711 master-0 kubenswrapper[13205]: I0319 09:28:35.018660 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:35.018711 master-0 kubenswrapper[13205]: I0319 09:28:35.018678 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b149c739-203d-4f5a-af11-dba6835ed71d-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:35.021445 master-0 kubenswrapper[13205]: I0319 09:28:35.021396 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b149c739-203d-4f5a-af11-dba6835ed71d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b149c739-203d-4f5a-af11-dba6835ed71d" (UID: "b149c739-203d-4f5a-af11-dba6835ed71d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:28:35.119785 master-0 kubenswrapper[13205]: I0319 09:28:35.119726 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b149c739-203d-4f5a-af11-dba6835ed71d-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:35.225689 master-0 kubenswrapper[13205]: E0319 09:28:35.225591 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9
Mar 19 09:28:35.783655 master-0 kubenswrapper[13205]: I0319 09:28:35.783430 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"b149c739-203d-4f5a-af11-dba6835ed71d","Type":"ContainerDied","Data":"d12e946845cda996b442f035801c35ad1cfcaa3e18a6b833bae46e4013d50906"}
Mar 19 09:28:35.783655 master-0 kubenswrapper[13205]: I0319 09:28:35.783469 13205 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 09:28:35.783655 master-0 kubenswrapper[13205]: I0319 09:28:35.783499 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d12e946845cda996b442f035801c35ad1cfcaa3e18a6b833bae46e4013d50906" Mar 19 09:28:35.785269 master-0 kubenswrapper[13205]: I0319 09:28:35.785244 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79f67cdc89-bx72w" event={"ID":"5a8e5bd7-de13-4773-8a38-5edf4fda23fd","Type":"ContainerStarted","Data":"3126c808e06276b72f50b1bcec104cc8290fd8a1252c1d1a5a621abc3da492cd"} Mar 19 09:28:37.143797 master-0 kubenswrapper[13205]: I0319 09:28:37.143753 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:37.143797 master-0 kubenswrapper[13205]: I0319 09:28:37.143799 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:28:37.147005 master-0 kubenswrapper[13205]: I0319 09:28:37.146959 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:28:37.147099 master-0 kubenswrapper[13205]: I0319 09:28:37.147025 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:28:44.352092 master-0 kubenswrapper[13205]: E0319 09:28:44.313977 13205 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:44.494013 master-0 kubenswrapper[13205]: I0319 09:28:44.493891 13205 patch_prober.go:28] interesting pod/oauth-openshift-fd57cd489-6jmpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:28:44.494343 master-0 kubenswrapper[13205]: I0319 09:28:44.494034 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:45.869976 master-0 kubenswrapper[13205]: I0319 09:28:45.869860 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log" Mar 19 09:28:45.871246 master-0 kubenswrapper[13205]: I0319 09:28:45.871183 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/1.log" Mar 19 09:28:45.873675 master-0 kubenswrapper[13205]: I0319 09:28:45.873610 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f" exitCode=1 Mar 19 09:28:45.873835 master-0 kubenswrapper[13205]: I0319 09:28:45.873688 13205 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f"} Mar 19 09:28:45.873835 master-0 kubenswrapper[13205]: I0319 09:28:45.873755 13205 scope.go:117] "RemoveContainer" containerID="96cfe0cf7dfe0d98d352c2ad678b9567500f91431662731fe6673b6785c78fae" Mar 19 09:28:45.874841 master-0 kubenswrapper[13205]: I0319 09:28:45.874783 13205 scope.go:117] "RemoveContainer" containerID="3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f" Mar 19 09:28:45.875404 master-0 kubenswrapper[13205]: E0319 09:28:45.875339 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:28:46.884657 master-0 kubenswrapper[13205]: I0319 09:28:46.884571 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log" Mar 19 09:28:47.143701 master-0 kubenswrapper[13205]: I0319 09:28:47.143558 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:28:47.143701 master-0 kubenswrapper[13205]: I0319 09:28:47.143641 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" 
probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:28:47.631290 master-0 kubenswrapper[13205]: I0319 09:28:47.631213 13205 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:28:47.632189 master-0 kubenswrapper[13205]: I0319 09:28:47.632154 13205 scope.go:117] "RemoveContainer" containerID="3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f" Mar 19 09:28:47.632710 master-0 kubenswrapper[13205]: E0319 09:28:47.632671 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:28:54.013372 master-0 kubenswrapper[13205]: I0319 09:28:54.013271 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:28:54.014392 master-0 kubenswrapper[13205]: I0319 09:28:54.014297 13205 scope.go:117] "RemoveContainer" containerID="3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f" Mar 19 09:28:54.015014 master-0 kubenswrapper[13205]: E0319 09:28:54.014956 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 
09:28:54.314858 master-0 kubenswrapper[13205]: E0319 09:28:54.314740 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:54.493626 master-0 kubenswrapper[13205]: I0319 09:28:54.493316 13205 patch_prober.go:28] interesting pod/oauth-openshift-fd57cd489-6jmpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:28:54.493626 master-0 kubenswrapper[13205]: I0319 09:28:54.493513 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:54.841289 master-0 kubenswrapper[13205]: I0319 09:28:54.841189 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:28:54.954767 master-0 kubenswrapper[13205]: I0319 09:28:54.954705 13205 scope.go:117] "RemoveContainer" containerID="3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f" Mar 19 09:28:54.955108 master-0 kubenswrapper[13205]: E0319 09:28:54.954999 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager 
pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:28:57.144714 master-0 kubenswrapper[13205]: I0319 09:28:57.144624 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:28:57.144714 master-0 kubenswrapper[13205]: I0319 09:28:57.144710 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:28:57.781889 master-0 kubenswrapper[13205]: I0319 09:28:57.781825 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift" containerID="cri-o://e885c91e3a7232fbbf5adc2d6e560bf02206f0e76f4d1357f37135b4d633b27d" gracePeriod=15 Mar 19 09:29:00.703694 master-0 kubenswrapper[13205]: E0319 09:29:00.702888 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2a86d5923559588380116772739510b0a665d181819fddbf855acf63cecadb32\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3721fab205c02b53b35057522b1ebb89ac3643d000d1fc2418aece7d395f7627\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746376668},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:9c6279da50a760828b0dabbd6e3baa384cadab3605c4d46e611ea749584e4c4a\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:cffdd23fb5aa53a255c309021bf3d4997520cb803392fa3b6aaa46563a46fb12\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1224180940},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33
626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\
\"],\\\"sizeBytes\\\":487096305}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:02.575110 master-0 kubenswrapper[13205]: I0319 09:29:02.575064 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 09:29:02.577596 master-0 kubenswrapper[13205]: I0319 09:29:02.577503 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 09:29:02.578845 master-0 kubenswrapper[13205]: I0319 09:29:02.578787 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log" Mar 19 09:29:02.579773 master-0 kubenswrapper[13205]: I0319 09:29:02.579741 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 09:29:02.581761 master-0 kubenswrapper[13205]: I0319 09:29:02.581710 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:29:02.762204 master-0 kubenswrapper[13205]: I0319 09:29:02.762055 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:29:02.762204 master-0 kubenswrapper[13205]: I0319 09:29:02.762164 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:29:02.762464 master-0 kubenswrapper[13205]: I0319 09:29:02.762207 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:29:02.762464 master-0 kubenswrapper[13205]: I0319 09:29:02.762242 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:29:02.762464 master-0 kubenswrapper[13205]: I0319 09:29:02.762309 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:29:02.762464 master-0 kubenswrapper[13205]: I0319 09:29:02.762369 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:29:02.762813 master-0 kubenswrapper[13205]: I0319 09:29:02.762748 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir" (OuterVolumeSpecName: "log-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:02.762879 master-0 kubenswrapper[13205]: I0319 09:29:02.762783 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir" (OuterVolumeSpecName: "data-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:02.762879 master-0 kubenswrapper[13205]: I0319 09:29:02.762781 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:02.762879 master-0 kubenswrapper[13205]: I0319 09:29:02.762816 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:02.762997 master-0 kubenswrapper[13205]: I0319 09:29:02.762886 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:02.762997 master-0 kubenswrapper[13205]: I0319 09:29:02.762823 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:02.763092 master-0 kubenswrapper[13205]: I0319 09:29:02.763057 13205 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:02.763156 master-0 kubenswrapper[13205]: I0319 09:29:02.763093 13205 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:02.763156 master-0 kubenswrapper[13205]: I0319 09:29:02.763103 13205 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:02.763156 master-0 kubenswrapper[13205]: I0319 09:29:02.763112 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:02.763156 master-0 kubenswrapper[13205]: I0319 09:29:02.763120 13205 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:02.763274 master-0 kubenswrapper[13205]: I0319 09:29:02.763159 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:02.856329 master-0 kubenswrapper[13205]: I0319 09:29:02.856272 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b4ed170d527099878cb5fdd508a2fb" path="/var/lib/kubelet/pods/24b4ed170d527099878cb5fdd508a2fb/volumes" Mar 19 09:29:03.021100 master-0 kubenswrapper[13205]: I0319 09:29:03.021036 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 09:29:03.022582 master-0 kubenswrapper[13205]: I0319 09:29:03.022545 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 09:29:03.023232 master-0 kubenswrapper[13205]: I0319 09:29:03.023198 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log" Mar 19 09:29:03.023711 master-0 kubenswrapper[13205]: I0319 09:29:03.023679 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 09:29:03.024666 master-0 kubenswrapper[13205]: I0319 09:29:03.024624 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" 
containerID="707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb" exitCode=137 Mar 19 09:29:03.024707 master-0 kubenswrapper[13205]: I0319 09:29:03.024662 13205 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51" exitCode=137 Mar 19 09:29:03.024739 master-0 kubenswrapper[13205]: I0319 09:29:03.024715 13205 scope.go:117] "RemoveContainer" containerID="4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492" Mar 19 09:29:03.024915 master-0 kubenswrapper[13205]: I0319 09:29:03.024883 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:29:03.052985 master-0 kubenswrapper[13205]: I0319 09:29:03.052934 13205 scope.go:117] "RemoveContainer" containerID="f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6" Mar 19 09:29:03.078043 master-0 kubenswrapper[13205]: I0319 09:29:03.077999 13205 scope.go:117] "RemoveContainer" containerID="5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23" Mar 19 09:29:03.090980 master-0 kubenswrapper[13205]: I0319 09:29:03.090944 13205 scope.go:117] "RemoveContainer" containerID="707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb" Mar 19 09:29:03.108862 master-0 kubenswrapper[13205]: I0319 09:29:03.108815 13205 scope.go:117] "RemoveContainer" containerID="88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51" Mar 19 09:29:03.124559 master-0 kubenswrapper[13205]: I0319 09:29:03.124491 13205 scope.go:117] "RemoveContainer" containerID="d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c" Mar 19 09:29:03.141573 master-0 kubenswrapper[13205]: I0319 09:29:03.141515 13205 scope.go:117] "RemoveContainer" containerID="cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360" Mar 19 09:29:03.157444 master-0 kubenswrapper[13205]: I0319 09:29:03.157385 13205 
scope.go:117] "RemoveContainer" containerID="88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0" Mar 19 09:29:03.172509 master-0 kubenswrapper[13205]: I0319 09:29:03.172479 13205 scope.go:117] "RemoveContainer" containerID="4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492" Mar 19 09:29:03.173081 master-0 kubenswrapper[13205]: E0319 09:29:03.173058 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492\": container with ID starting with 4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492 not found: ID does not exist" containerID="4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492" Mar 19 09:29:03.173221 master-0 kubenswrapper[13205]: I0319 09:29:03.173191 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492"} err="failed to get container status \"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492\": rpc error: code = NotFound desc = could not find container \"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492\": container with ID starting with 4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492 not found: ID does not exist" Mar 19 09:29:03.173314 master-0 kubenswrapper[13205]: I0319 09:29:03.173296 13205 scope.go:117] "RemoveContainer" containerID="f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6" Mar 19 09:29:03.173801 master-0 kubenswrapper[13205]: E0319 09:29:03.173750 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6\": container with ID starting with f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6 not found: ID does not exist" 
containerID="f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6" Mar 19 09:29:03.173881 master-0 kubenswrapper[13205]: I0319 09:29:03.173819 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6"} err="failed to get container status \"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6\": rpc error: code = NotFound desc = could not find container \"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6\": container with ID starting with f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6 not found: ID does not exist" Mar 19 09:29:03.173881 master-0 kubenswrapper[13205]: I0319 09:29:03.173867 13205 scope.go:117] "RemoveContainer" containerID="5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23" Mar 19 09:29:03.174307 master-0 kubenswrapper[13205]: E0319 09:29:03.174244 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23\": container with ID starting with 5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23 not found: ID does not exist" containerID="5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23" Mar 19 09:29:03.174371 master-0 kubenswrapper[13205]: I0319 09:29:03.174312 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23"} err="failed to get container status \"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23\": rpc error: code = NotFound desc = could not find container \"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23\": container with ID starting with 5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23 not found: ID does not exist" Mar 19 09:29:03.174371 master-0 
kubenswrapper[13205]: I0319 09:29:03.174346 13205 scope.go:117] "RemoveContainer" containerID="707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb" Mar 19 09:29:03.174667 master-0 kubenswrapper[13205]: E0319 09:29:03.174635 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb\": container with ID starting with 707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb not found: ID does not exist" containerID="707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb" Mar 19 09:29:03.174735 master-0 kubenswrapper[13205]: I0319 09:29:03.174662 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb"} err="failed to get container status \"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb\": rpc error: code = NotFound desc = could not find container \"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb\": container with ID starting with 707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb not found: ID does not exist" Mar 19 09:29:03.174735 master-0 kubenswrapper[13205]: I0319 09:29:03.174682 13205 scope.go:117] "RemoveContainer" containerID="88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51" Mar 19 09:29:03.175084 master-0 kubenswrapper[13205]: E0319 09:29:03.175027 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51\": container with ID starting with 88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51 not found: ID does not exist" containerID="88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51" Mar 19 09:29:03.175149 master-0 kubenswrapper[13205]: I0319 09:29:03.175095 
13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51"} err="failed to get container status \"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51\": rpc error: code = NotFound desc = could not find container \"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51\": container with ID starting with 88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51 not found: ID does not exist" Mar 19 09:29:03.175149 master-0 kubenswrapper[13205]: I0319 09:29:03.175131 13205 scope.go:117] "RemoveContainer" containerID="d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c" Mar 19 09:29:03.175435 master-0 kubenswrapper[13205]: E0319 09:29:03.175409 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c\": container with ID starting with d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c not found: ID does not exist" containerID="d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c" Mar 19 09:29:03.175500 master-0 kubenswrapper[13205]: I0319 09:29:03.175443 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c"} err="failed to get container status \"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c\": rpc error: code = NotFound desc = could not find container \"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c\": container with ID starting with d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c not found: ID does not exist" Mar 19 09:29:03.175500 master-0 kubenswrapper[13205]: I0319 09:29:03.175462 13205 scope.go:117] "RemoveContainer" 
containerID="cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360" Mar 19 09:29:03.175890 master-0 kubenswrapper[13205]: E0319 09:29:03.175789 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360\": container with ID starting with cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360 not found: ID does not exist" containerID="cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360" Mar 19 09:29:03.175890 master-0 kubenswrapper[13205]: I0319 09:29:03.175845 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360"} err="failed to get container status \"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360\": rpc error: code = NotFound desc = could not find container \"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360\": container with ID starting with cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360 not found: ID does not exist" Mar 19 09:29:03.175982 master-0 kubenswrapper[13205]: I0319 09:29:03.175885 13205 scope.go:117] "RemoveContainer" containerID="88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0" Mar 19 09:29:03.176240 master-0 kubenswrapper[13205]: E0319 09:29:03.176205 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0\": container with ID starting with 88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0 not found: ID does not exist" containerID="88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0" Mar 19 09:29:03.176303 master-0 kubenswrapper[13205]: I0319 09:29:03.176234 13205 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0"} err="failed to get container status \"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0\": rpc error: code = NotFound desc = could not find container \"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0\": container with ID starting with 88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0 not found: ID does not exist" Mar 19 09:29:03.176303 master-0 kubenswrapper[13205]: I0319 09:29:03.176254 13205 scope.go:117] "RemoveContainer" containerID="4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492" Mar 19 09:29:03.176612 master-0 kubenswrapper[13205]: I0319 09:29:03.176559 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492"} err="failed to get container status \"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492\": rpc error: code = NotFound desc = could not find container \"4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492\": container with ID starting with 4553436fdfb20c39b8b6f4fe48d3aa7a1406a03229db1847b0f6fd70a10eb492 not found: ID does not exist" Mar 19 09:29:03.176678 master-0 kubenswrapper[13205]: I0319 09:29:03.176618 13205 scope.go:117] "RemoveContainer" containerID="f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6" Mar 19 09:29:03.176935 master-0 kubenswrapper[13205]: I0319 09:29:03.176901 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6"} err="failed to get container status \"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6\": rpc error: code = NotFound desc = could not find container \"f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6\": container with ID starting with 
f94c768e05cc599fb390104f803b849ae114d571a7f0ccafbd4fa8f02d4174f6 not found: ID does not exist" Mar 19 09:29:03.176935 master-0 kubenswrapper[13205]: I0319 09:29:03.176923 13205 scope.go:117] "RemoveContainer" containerID="5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23" Mar 19 09:29:03.177273 master-0 kubenswrapper[13205]: I0319 09:29:03.177207 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23"} err="failed to get container status \"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23\": rpc error: code = NotFound desc = could not find container \"5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23\": container with ID starting with 5dd2d21d5dd45e876ad2579095d61fae7f8146198e1afc805b530adca7b39c23 not found: ID does not exist" Mar 19 09:29:03.177339 master-0 kubenswrapper[13205]: I0319 09:29:03.177271 13205 scope.go:117] "RemoveContainer" containerID="707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb" Mar 19 09:29:03.177775 master-0 kubenswrapper[13205]: I0319 09:29:03.177735 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb"} err="failed to get container status \"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb\": rpc error: code = NotFound desc = could not find container \"707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb\": container with ID starting with 707e3e6322f0c91c19da9fa7884cd0618714bd7ea882770d56ca80136b6139cb not found: ID does not exist" Mar 19 09:29:03.177775 master-0 kubenswrapper[13205]: I0319 09:29:03.177757 13205 scope.go:117] "RemoveContainer" containerID="88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51" Mar 19 09:29:03.178120 master-0 kubenswrapper[13205]: I0319 09:29:03.178050 13205 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51"} err="failed to get container status \"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51\": rpc error: code = NotFound desc = could not find container \"88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51\": container with ID starting with 88f87b623e4eefd30e85575f26fa3e04ca23a434ddb43fc54031804dde468b51 not found: ID does not exist" Mar 19 09:29:03.178198 master-0 kubenswrapper[13205]: I0319 09:29:03.178117 13205 scope.go:117] "RemoveContainer" containerID="d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c" Mar 19 09:29:03.178454 master-0 kubenswrapper[13205]: I0319 09:29:03.178430 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c"} err="failed to get container status \"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c\": rpc error: code = NotFound desc = could not find container \"d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c\": container with ID starting with d7623b95d4aa672ff3501a212183be0b8615e8509677c5254871e8f275cfe75c not found: ID does not exist" Mar 19 09:29:03.178585 master-0 kubenswrapper[13205]: I0319 09:29:03.178570 13205 scope.go:117] "RemoveContainer" containerID="cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360" Mar 19 09:29:03.179033 master-0 kubenswrapper[13205]: I0319 09:29:03.179002 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360"} err="failed to get container status \"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360\": rpc error: code = NotFound desc = could not find container \"cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360\": container with ID starting 
with cd53f37e2cd7ebc84d8abc7f8891da9db0db9b8fb6e584c3d9296e1adff5c360 not found: ID does not exist" Mar 19 09:29:03.179033 master-0 kubenswrapper[13205]: I0319 09:29:03.179026 13205 scope.go:117] "RemoveContainer" containerID="88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0" Mar 19 09:29:03.179387 master-0 kubenswrapper[13205]: I0319 09:29:03.179321 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0"} err="failed to get container status \"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0\": rpc error: code = NotFound desc = could not find container \"88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0\": container with ID starting with 88a201221923d33609cff28d46a6a9288c3b75709464163cc0b8e34d328504e0 not found: ID does not exist" Mar 19 09:29:04.315549 master-0 kubenswrapper[13205]: E0319 09:29:04.315473 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:04.494437 master-0 kubenswrapper[13205]: I0319 09:29:04.494367 13205 patch_prober.go:28] interesting pod/oauth-openshift-fd57cd489-6jmpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.85:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:29:04.494908 master-0 kubenswrapper[13205]: I0319 09:29:04.494838 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.85:6443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:05.920732 master-0 kubenswrapper[13205]: E0319 09:29:05.919960 13205 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{route-controller-manager-8555fbf585-9ggfr.189e3407eb15c8ec openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-8555fbf585-9ggfr,UID:67d66357-fcee-4e70-b563-5895b978ab55,APIVersion:v1,ResourceVersion:8855,FieldPath:spec.containers{route-controller-manager},},Reason:Killing,Message:Stopping container route-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:28:31.844042988 +0000 UTC m=+297.176349876,LastTimestamp:2026-03-19 09:28:31.844042988 +0000 UTC m=+297.176349876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:29:07.144424 master-0 kubenswrapper[13205]: I0319 09:29:07.144335 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:29:07.144424 master-0 kubenswrapper[13205]: I0319 09:29:07.144412 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:29:08.849737 master-0 kubenswrapper[13205]: I0319 09:29:08.849642 13205 scope.go:117] 
"RemoveContainer" containerID="3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f" Mar 19 09:29:10.096994 master-0 kubenswrapper[13205]: I0319 09:29:10.096843 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log" Mar 19 09:29:10.099169 master-0 kubenswrapper[13205]: I0319 09:29:10.099036 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8"} Mar 19 09:29:10.704769 master-0 kubenswrapper[13205]: E0319 09:29:10.704679 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:10.849214 master-0 kubenswrapper[13205]: I0319 09:29:10.849072 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:29:10.877992 master-0 kubenswrapper[13205]: I0319 09:29:10.877892 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:29:10.877992 master-0 kubenswrapper[13205]: I0319 09:29:10.877947 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:29:13.126763 master-0 kubenswrapper[13205]: I0319 09:29:13.126646 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-fd57cd489-6jmpf_e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5/oauth-openshift/0.log" Mar 19 09:29:13.126763 master-0 kubenswrapper[13205]: I0319 09:29:13.126741 13205 generic.go:334] "Generic (PLEG): container finished" podID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerID="e885c91e3a7232fbbf5adc2d6e560bf02206f0e76f4d1357f37135b4d633b27d" exitCode=137 Mar 19 09:29:13.127288 master-0 kubenswrapper[13205]: I0319 09:29:13.126792 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" event={"ID":"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5","Type":"ContainerDied","Data":"e885c91e3a7232fbbf5adc2d6e560bf02206f0e76f4d1357f37135b4d633b27d"} Mar 19 09:29:13.723026 master-0 kubenswrapper[13205]: I0319 09:29:13.722960 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-fd57cd489-6jmpf_e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5/oauth-openshift/0.log" Mar 19 09:29:13.723270 master-0 kubenswrapper[13205]: I0319 09:29:13.723057 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748460 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xxcg\" (UniqueName: \"kubernetes.io/projected/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-kube-api-access-7xxcg\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748570 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-login\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748653 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-policies\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748695 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-serving-cert\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748733 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-service-ca\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: 
\"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748789 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-session\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748820 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-dir\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748863 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748910 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-trusted-ca-bundle\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748951 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-provider-selection\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: 
\"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.748987 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.749039 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-error\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.749066 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-ocp-branding-template\") pod \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\" (UID: \"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5\") " Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.751054 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.751368 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:13.752499 master-0 kubenswrapper[13205]: I0319 09:29:13.751917 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:29:13.753327 master-0 kubenswrapper[13205]: I0319 09:29:13.752851 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:29:13.753327 master-0 kubenswrapper[13205]: I0319 09:29:13.752867 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:29:13.753559 master-0 kubenswrapper[13205]: I0319 09:29:13.753509 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:29:13.754302 master-0 kubenswrapper[13205]: I0319 09:29:13.754256 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-kube-api-access-7xxcg" (OuterVolumeSpecName: "kube-api-access-7xxcg") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "kube-api-access-7xxcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:29:13.754480 master-0 kubenswrapper[13205]: I0319 09:29:13.754454 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:29:13.754995 master-0 kubenswrapper[13205]: I0319 09:29:13.754962 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:29:13.755175 master-0 kubenswrapper[13205]: I0319 09:29:13.755136 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:29:13.755411 master-0 kubenswrapper[13205]: I0319 09:29:13.755375 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:29:13.757051 master-0 kubenswrapper[13205]: I0319 09:29:13.756907 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:29:13.757145 master-0 kubenswrapper[13205]: I0319 09:29:13.757109 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" (UID: "e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:29:13.851335 master-0 kubenswrapper[13205]: I0319 09:29:13.851266 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851335 master-0 kubenswrapper[13205]: I0319 09:29:13.851324 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851349 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851370 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851390 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851411 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xxcg\" (UniqueName: 
\"kubernetes.io/projected/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-kube-api-access-7xxcg\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851431 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851453 13205 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851471 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851491 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851509 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 kubenswrapper[13205]: I0319 09:29:13.851550 13205 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:13.851598 master-0 
kubenswrapper[13205]: I0319 09:29:13.851570 13205 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:14.013392 master-0 kubenswrapper[13205]: I0319 09:29:14.013186 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:29:14.022130 master-0 kubenswrapper[13205]: I0319 09:29:14.022036 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:29:14.136584 master-0 kubenswrapper[13205]: I0319 09:29:14.136511 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-fd57cd489-6jmpf_e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5/oauth-openshift/0.log" Mar 19 09:29:14.137119 master-0 kubenswrapper[13205]: I0319 09:29:14.136678 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" Mar 19 09:29:14.137119 master-0 kubenswrapper[13205]: I0319 09:29:14.136660 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" event={"ID":"e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5","Type":"ContainerDied","Data":"126ac4435cf417f6c704f4b02df9d702c9bd43bddbef0b99b8aeeb2f414a1645"} Mar 19 09:29:14.137119 master-0 kubenswrapper[13205]: I0319 09:29:14.136862 13205 scope.go:117] "RemoveContainer" containerID="e885c91e3a7232fbbf5adc2d6e560bf02206f0e76f4d1357f37135b4d633b27d" Mar 19 09:29:14.137237 master-0 kubenswrapper[13205]: I0319 09:29:14.137160 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:29:14.316804 master-0 kubenswrapper[13205]: E0319 09:29:14.316717 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:14.493870 master-0 kubenswrapper[13205]: I0319 09:29:14.493784 13205 patch_prober.go:28] interesting pod/oauth-openshift-fd57cd489-6jmpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.85:6443/healthz\": dial tcp 10.128.0.85:6443: i/o timeout" start-of-body= Mar 19 09:29:14.493870 master-0 kubenswrapper[13205]: I0319 09:29:14.493856 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-fd57cd489-6jmpf" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.85:6443/healthz\": dial tcp 10.128.0.85:6443: i/o timeout" Mar 19 09:29:17.144477 master-0 kubenswrapper[13205]: I0319 09:29:17.144384 13205 
patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:29:17.145178 master-0 kubenswrapper[13205]: I0319 09:29:17.144493 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:29:20.705303 master-0 kubenswrapper[13205]: E0319 09:29:20.705250 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:24.317465 master-0 kubenswrapper[13205]: E0319 09:29:24.317370 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:24.317465 master-0 kubenswrapper[13205]: I0319 09:29:24.317467 13205 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:29:24.844962 master-0 kubenswrapper[13205]: I0319 09:29:24.844905 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:29:27.144631 master-0 kubenswrapper[13205]: I0319 09:29:27.144513 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:29:27.144631 master-0 kubenswrapper[13205]: I0319 09:29:27.144633 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:29:30.707127 master-0 kubenswrapper[13205]: E0319 09:29:30.707063 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:29:32.166640 master-0 kubenswrapper[13205]: E0319 09:29:32.166575 13205 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:29:32.166640 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f" Netns:"/var/run/netns/bd8ea498-9043-4b63-be18-7af03795b22d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] 
networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:29:32.166640 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:29:32.166640 master-0 kubenswrapper[13205]: > Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: E0319 09:29:32.166681 13205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f" Netns:"/var/run/netns/bd8ea498-9043-4b63-be18-7af03795b22d" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: > pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: E0319 09:29:32.166715 13205 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f): error adding pod openshift-kube-controller-manager_installer-2-master-0 to 
CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f" Netns:"/var/run/netns/bd8ea498-9043-4b63-be18-7af03795b22d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: > pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:29:32.167555 master-0 kubenswrapper[13205]: E0319 09:29:32.166820 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"installer-2-master-0_openshift-kube-controller-manager(e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-2-master-0_openshift-kube-controller-manager(e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f\\\" Netns:\\\"/var/run/netns/bd8ea498-9043-4b63-be18-7af03795b22d\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=99adcdec16f0f1f6f77f5376c8f4847503dcbe84b6bd2a20147fbf8e26f8311f;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Mar 19 09:29:32.277149 master-0 kubenswrapper[13205]: I0319 09:29:32.277078 13205 status_manager.go:851] "Failed to get status for pod" podUID="67d66357-fcee-4e70-b563-5895b978ab55" pod="openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods route-controller-manager-8555fbf585-9ggfr)" Mar 19 09:29:32.278343 master-0 kubenswrapper[13205]: I0319 09:29:32.278280 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:29:32.281623 master-0 kubenswrapper[13205]: I0319 09:29:32.281569 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:29:34.317807 master-0 kubenswrapper[13205]: E0319 09:29:34.317692 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="200ms" Mar 19 09:29:35.210843 master-0 kubenswrapper[13205]: E0319 09:29:35.210777 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:29:36.309945 master-0 kubenswrapper[13205]: I0319 09:29:36.309853 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/1.log" Mar 19 09:29:36.311492 master-0 kubenswrapper[13205]: I0319 09:29:36.311435 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/0.log" Mar 19 09:29:36.312180 master-0 kubenswrapper[13205]: I0319 09:29:36.312108 13205 generic.go:334] "Generic (PLEG): container finished" podID="58ea8fcc-29b2-48ef-8629-2ba217c9d70c" containerID="0c9bb6f28236e5e26577492918d9d691fd6d1f78a2da9cc0727e44bdd383f7c9" exitCode=1 Mar 19 09:29:36.312320 master-0 kubenswrapper[13205]: I0319 09:29:36.312183 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-slmgx" 
event={"ID":"58ea8fcc-29b2-48ef-8629-2ba217c9d70c","Type":"ContainerDied","Data":"0c9bb6f28236e5e26577492918d9d691fd6d1f78a2da9cc0727e44bdd383f7c9"} Mar 19 09:29:36.312320 master-0 kubenswrapper[13205]: I0319 09:29:36.312252 13205 scope.go:117] "RemoveContainer" containerID="16dabbfac23a88b18e7a1e5f639f318226358e768cd4e0f4bf6b8327e7b845c9" Mar 19 09:29:36.313113 master-0 kubenswrapper[13205]: I0319 09:29:36.313059 13205 scope.go:117] "RemoveContainer" containerID="0c9bb6f28236e5e26577492918d9d691fd6d1f78a2da9cc0727e44bdd383f7c9" Mar 19 09:29:37.144876 master-0 kubenswrapper[13205]: I0319 09:29:37.144806 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:29:37.145576 master-0 kubenswrapper[13205]: I0319 09:29:37.145472 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:29:37.322711 master-0 kubenswrapper[13205]: I0319 09:29:37.322626 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-slmgx_58ea8fcc-29b2-48ef-8629-2ba217c9d70c/approver/1.log" Mar 19 09:29:37.323721 master-0 kubenswrapper[13205]: I0319 09:29:37.323253 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-slmgx" event={"ID":"58ea8fcc-29b2-48ef-8629-2ba217c9d70c","Type":"ContainerStarted","Data":"423f79d4508892abdf24ba7ef41633498ef03fdc816565927a00dde38fc06f6c"} Mar 19 09:29:39.923796 master-0 kubenswrapper[13205]: E0319 09:29:39.923619 13205 event.go:359] "Server rejected event (will not retry!)" 
err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3407ebe15810 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Killing,Message:Stopping container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:28:31.85738344 +0000 UTC m=+297.189690338,LastTimestamp:2026-03-19 09:28:31.85738344 +0000 UTC m=+297.189690338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:29:40.709364 master-0 kubenswrapper[13205]: E0319 09:29:40.709220 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 19 09:29:40.709364 master-0 kubenswrapper[13205]: E0319 09:29:40.709286 13205 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:29:44.518554 master-0 kubenswrapper[13205]: E0319 09:29:44.518400 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 19 09:29:44.881410 master-0 kubenswrapper[13205]: E0319 09:29:44.881253 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 09:29:44.881959 master-0 kubenswrapper[13205]: 
I0319 09:29:44.881920 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:29:44.914937 master-0 kubenswrapper[13205]: W0319 09:29:44.914829 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094204df314fe45bd5af12ca1b4622bb.slice/crio-4e3b5c008471dcf2740f0f5204969b445634731caca579ddcbe0ceb8ffdfe343 WatchSource:0}: Error finding container 4e3b5c008471dcf2740f0f5204969b445634731caca579ddcbe0ceb8ffdfe343: Status 404 returned error can't find the container with id 4e3b5c008471dcf2740f0f5204969b445634731caca579ddcbe0ceb8ffdfe343 Mar 19 09:29:45.394323 master-0 kubenswrapper[13205]: I0319 09:29:45.394249 13205 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="03db9d08dcb39542cdc2275eeec986fdbc8b996cc2a97e709c569b27e770921e" exitCode=0 Mar 19 09:29:45.394323 master-0 kubenswrapper[13205]: I0319 09:29:45.394310 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"03db9d08dcb39542cdc2275eeec986fdbc8b996cc2a97e709c569b27e770921e"} Mar 19 09:29:45.394898 master-0 kubenswrapper[13205]: I0319 09:29:45.394352 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"4e3b5c008471dcf2740f0f5204969b445634731caca579ddcbe0ceb8ffdfe343"} Mar 19 09:29:45.394898 master-0 kubenswrapper[13205]: I0319 09:29:45.394857 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:29:45.394898 master-0 kubenswrapper[13205]: I0319 09:29:45.394883 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 
09:29:47.144232 master-0 kubenswrapper[13205]: I0319 09:29:47.144116 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:29:47.144938 master-0 kubenswrapper[13205]: I0319 09:29:47.144263 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:29:54.919464 master-0 kubenswrapper[13205]: E0319 09:29:54.919337 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 19 09:29:57.144507 master-0 kubenswrapper[13205]: I0319 09:29:57.144381 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:29:57.144507 master-0 kubenswrapper[13205]: I0319 09:29:57.144489 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:30:00.941769 master-0 kubenswrapper[13205]: E0319 09:30:00.941548 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:29:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:29:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:29:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:29:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2a86d5923559588380116772739510b0a665d181819fddbf855acf63cecadb32\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3721fab205c02b53b35057522b1ebb89ac3643d000d1fc2418aece7d395f7627\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746376668},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:9c6279da50a760828b0dabbd6e3baa384cadab3605c4d46e611ea749584e4c4a\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:cffdd23fb5aa53a255c309021bf3d4997520cb803392fa3b6aaa46563a46fb12\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1224180940},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33
626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\
\"],\\\"sizeBytes\\\":487096305}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline exceeded" Mar 19 09:30:05.721413 master-0 kubenswrapper[13205]: E0319 09:30:05.721270 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 19 09:30:07.144691 master-0 kubenswrapper[13205]: I0319 09:30:07.144608 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:30:07.145648 master-0 kubenswrapper[13205]: I0319 09:30:07.144700 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:30:10.942750 master-0 kubenswrapper[13205]: E0319 09:30:10.942641 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:30:13.926712 master-0 kubenswrapper[13205]: E0319 09:30:13.926570 13205 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3407ebe34673 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:28:31.857510003 +0000 UTC m=+297.189816901,LastTimestamp:2026-03-19 09:28:31.857510003 +0000 UTC m=+297.189816901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:30:17.143862 master-0 kubenswrapper[13205]: I0319 09:30:17.143812 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:30:17.143862 master-0 kubenswrapper[13205]: I0319 09:30:17.143875 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:30:17.323277 master-0 kubenswrapper[13205]: E0319 09:30:17.323123 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 19 09:30:19.398569 master-0 kubenswrapper[13205]: E0319 09:30:19.398421 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 09:30:20.686990 master-0 
kubenswrapper[13205]: I0319 09:30:20.686875 13205 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="7add10caa6d5266b75a7a7ed541cb7c4b7e29d67413f26535f7d78ad880df73f" exitCode=0 Mar 19 09:30:20.686990 master-0 kubenswrapper[13205]: I0319 09:30:20.686963 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"7add10caa6d5266b75a7a7ed541cb7c4b7e29d67413f26535f7d78ad880df73f"} Mar 19 09:30:20.688141 master-0 kubenswrapper[13205]: I0319 09:30:20.687588 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:30:20.688141 master-0 kubenswrapper[13205]: I0319 09:30:20.687615 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:30:20.944830 master-0 kubenswrapper[13205]: E0319 09:30:20.942836 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Mar 19 09:30:23.717735 master-0 kubenswrapper[13205]: I0319 09:30:23.717612 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-r28hm_3374940a-612d-4335-8236-3ffe8d6e73a5/manager/0.log" Mar 19 09:30:23.718440 master-0 kubenswrapper[13205]: I0319 09:30:23.718367 13205 generic.go:334] "Generic (PLEG): container finished" podID="3374940a-612d-4335-8236-3ffe8d6e73a5" containerID="b1150c5ddc8f3dad3084433acf3e72b1db9c58ad0b6290a41f9524f5639d8b9c" exitCode=1 Mar 19 09:30:23.718554 master-0 kubenswrapper[13205]: I0319 09:30:23.718480 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" 
event={"ID":"3374940a-612d-4335-8236-3ffe8d6e73a5","Type":"ContainerDied","Data":"b1150c5ddc8f3dad3084433acf3e72b1db9c58ad0b6290a41f9524f5639d8b9c"} Mar 19 09:30:23.720019 master-0 kubenswrapper[13205]: I0319 09:30:23.719960 13205 scope.go:117] "RemoveContainer" containerID="b1150c5ddc8f3dad3084433acf3e72b1db9c58ad0b6290a41f9524f5639d8b9c" Mar 19 09:30:23.722403 master-0 kubenswrapper[13205]: I0319 09:30:23.722351 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-ft7tt_f585ebb1-6210-463b-af85-fb29e1e7dfa5/manager/0.log" Mar 19 09:30:23.722487 master-0 kubenswrapper[13205]: I0319 09:30:23.722431 13205 generic.go:334] "Generic (PLEG): container finished" podID="f585ebb1-6210-463b-af85-fb29e1e7dfa5" containerID="76f3de4c762cae478a577d1d16dfb1ee4af5fa68c3f21bb2b3efce645591fdc4" exitCode=1 Mar 19 09:30:23.722621 master-0 kubenswrapper[13205]: I0319 09:30:23.722541 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" event={"ID":"f585ebb1-6210-463b-af85-fb29e1e7dfa5","Type":"ContainerDied","Data":"76f3de4c762cae478a577d1d16dfb1ee4af5fa68c3f21bb2b3efce645591fdc4"} Mar 19 09:30:23.723304 master-0 kubenswrapper[13205]: I0319 09:30:23.723274 13205 scope.go:117] "RemoveContainer" containerID="76f3de4c762cae478a577d1d16dfb1ee4af5fa68c3f21bb2b3efce645591fdc4" Mar 19 09:30:23.725653 master-0 kubenswrapper[13205]: I0319 09:30:23.725609 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/0.log" Mar 19 09:30:23.725726 master-0 kubenswrapper[13205]: I0319 09:30:23.725658 13205 generic.go:334] "Generic (PLEG): container finished" podID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" containerID="5732fa3ff6aaea0289273acea825bfaab46efed575658d801c96fd54df3453e0" 
exitCode=1 Mar 19 09:30:23.725786 master-0 kubenswrapper[13205]: I0319 09:30:23.725701 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerDied","Data":"5732fa3ff6aaea0289273acea825bfaab46efed575658d801c96fd54df3453e0"} Mar 19 09:30:23.726381 master-0 kubenswrapper[13205]: I0319 09:30:23.726352 13205 scope.go:117] "RemoveContainer" containerID="5732fa3ff6aaea0289273acea825bfaab46efed575658d801c96fd54df3453e0" Mar 19 09:30:24.738637 master-0 kubenswrapper[13205]: I0319 09:30:24.738450 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-ft7tt_f585ebb1-6210-463b-af85-fb29e1e7dfa5/manager/0.log" Mar 19 09:30:24.739479 master-0 kubenswrapper[13205]: I0319 09:30:24.738603 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" event={"ID":"f585ebb1-6210-463b-af85-fb29e1e7dfa5","Type":"ContainerStarted","Data":"27f52ebb959a6f7770709a341d537c3b74f718075565d802b01ecb3ff85c2529"} Mar 19 09:30:24.739479 master-0 kubenswrapper[13205]: I0319 09:30:24.739121 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:30:24.741172 master-0 kubenswrapper[13205]: I0319 09:30:24.741123 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/config-sync-controllers/0.log" Mar 19 09:30:24.742108 master-0 kubenswrapper[13205]: I0319 09:30:24.742045 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/cluster-cloud-controller-manager/0.log" Mar 19 09:30:24.742207 master-0 kubenswrapper[13205]: I0319 09:30:24.742144 13205 generic.go:334] "Generic (PLEG): container finished" podID="9e10cb6e-5703-4e4d-a82b-f6de34888b65" containerID="041e723992540922436e90b5d9095bce07f4956cd2e2523edb1bcdbcbea31c25" exitCode=1 Mar 19 09:30:24.742207 master-0 kubenswrapper[13205]: I0319 09:30:24.742179 13205 generic.go:334] "Generic (PLEG): container finished" podID="9e10cb6e-5703-4e4d-a82b-f6de34888b65" containerID="db787764635a7c7132d16869134e3fbb91501f635bc88c71e3e87ec410b2b532" exitCode=1 Mar 19 09:30:24.742347 master-0 kubenswrapper[13205]: I0319 09:30:24.742199 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerDied","Data":"041e723992540922436e90b5d9095bce07f4956cd2e2523edb1bcdbcbea31c25"} Mar 19 09:30:24.742347 master-0 kubenswrapper[13205]: I0319 09:30:24.742268 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerDied","Data":"db787764635a7c7132d16869134e3fbb91501f635bc88c71e3e87ec410b2b532"} Mar 19 09:30:24.743234 master-0 kubenswrapper[13205]: I0319 09:30:24.743170 13205 scope.go:117] "RemoveContainer" containerID="db787764635a7c7132d16869134e3fbb91501f635bc88c71e3e87ec410b2b532" Mar 19 09:30:24.743336 master-0 kubenswrapper[13205]: I0319 09:30:24.743238 13205 scope.go:117] "RemoveContainer" containerID="041e723992540922436e90b5d9095bce07f4956cd2e2523edb1bcdbcbea31c25" Mar 19 09:30:24.744109 master-0 kubenswrapper[13205]: I0319 09:30:24.744060 13205 
generic.go:334] "Generic (PLEG): container finished" podID="dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e" containerID="d4be4fce578d40cec533c0ab0b2ea7b1d2f8bbfad85eab154b0c3268083f1916" exitCode=0 Mar 19 09:30:24.744203 master-0 kubenswrapper[13205]: I0319 09:30:24.744124 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" event={"ID":"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e","Type":"ContainerDied","Data":"d4be4fce578d40cec533c0ab0b2ea7b1d2f8bbfad85eab154b0c3268083f1916"} Mar 19 09:30:24.744925 master-0 kubenswrapper[13205]: I0319 09:30:24.744444 13205 scope.go:117] "RemoveContainer" containerID="d4be4fce578d40cec533c0ab0b2ea7b1d2f8bbfad85eab154b0c3268083f1916" Mar 19 09:30:24.747838 master-0 kubenswrapper[13205]: I0319 09:30:24.747802 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/0.log" Mar 19 09:30:24.747943 master-0 kubenswrapper[13205]: I0319 09:30:24.747913 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerStarted","Data":"904c1bde5e53dae3086fb4db2d5212100fb16c3338576e000620100b07cbb80c"} Mar 19 09:30:24.750928 master-0 kubenswrapper[13205]: I0319 09:30:24.750877 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-r28hm_3374940a-612d-4335-8236-3ffe8d6e73a5/manager/0.log" Mar 19 09:30:24.751588 master-0 kubenswrapper[13205]: I0319 09:30:24.751473 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" event={"ID":"3374940a-612d-4335-8236-3ffe8d6e73a5","Type":"ContainerStarted","Data":"c26f43350a2cb85a07c0d0a3d3b58965155db3ef0f22f9a60e87bb4d8b1509c1"} Mar 19 
09:30:24.751878 master-0 kubenswrapper[13205]: I0319 09:30:24.751826 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:30:25.769764 master-0 kubenswrapper[13205]: I0319 09:30:25.769709 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/config-sync-controllers/0.log" Mar 19 09:30:25.770845 master-0 kubenswrapper[13205]: I0319 09:30:25.770794 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/cluster-cloud-controller-manager/0.log" Mar 19 09:30:25.771032 master-0 kubenswrapper[13205]: I0319 09:30:25.770982 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerStarted","Data":"e9e5f151ae07a8d3f803f43095e8b8d2db331e4c075845e08d68a336dcd95ca5"} Mar 19 09:30:25.771176 master-0 kubenswrapper[13205]: I0319 09:30:25.771035 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-r2cs7" event={"ID":"9e10cb6e-5703-4e4d-a82b-f6de34888b65","Type":"ContainerStarted","Data":"7b6d790a0aa44d493251572f79bbd68fd80c7b8d30eb2d936cd682c598fd73f4"} Mar 19 09:30:25.775405 master-0 kubenswrapper[13205]: I0319 09:30:25.775318 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" event={"ID":"dfbf2c4f-d1ac-4df7-b4ee-50001c617d4e","Type":"ContainerStarted","Data":"a8b009fa7b4a0536797bda0cf42011083a929ab6481efb70882ea9f155e75d16"} Mar 19 09:30:25.776010 
master-0 kubenswrapper[13205]: I0319 09:30:25.775932 13205 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" containerID="cri-o://d4be4fce578d40cec533c0ab0b2ea7b1d2f8bbfad85eab154b0c3268083f1916" Mar 19 09:30:25.776010 master-0 kubenswrapper[13205]: I0319 09:30:25.775980 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:30:26.801846 master-0 kubenswrapper[13205]: I0319 09:30:26.801743 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:30:26.805730 master-0 kubenswrapper[13205]: I0319 09:30:26.805652 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-gxznr" Mar 19 09:30:27.144127 master-0 kubenswrapper[13205]: I0319 09:30:27.143884 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:30:27.144127 master-0 kubenswrapper[13205]: I0319 09:30:27.144030 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:30:30.524724 master-0 kubenswrapper[13205]: E0319 09:30:30.524590 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
interval="6.4s" Mar 19 09:30:30.638836 master-0 kubenswrapper[13205]: I0319 09:30:30.638621 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-r28hm" Mar 19 09:30:30.639779 master-0 kubenswrapper[13205]: I0319 09:30:30.639344 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-ft7tt" Mar 19 09:30:30.943994 master-0 kubenswrapper[13205]: E0319 09:30:30.943849 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:30:32.279132 master-0 kubenswrapper[13205]: I0319 09:30:32.279045 13205 status_manager.go:851] "Failed to get status for pod" podUID="24b4ed170d527099878cb5fdd508a2fb" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 19 09:30:32.985482 master-0 kubenswrapper[13205]: E0319 09:30:32.985418 13205 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:30:32.985482 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174" Netns:"/var/run/netns/452c2937-c7a4-45dd-8fa3-b84427d7da0e" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:30:32.985482 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:30:32.985482 master-0 kubenswrapper[13205]: > Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: E0319 09:30:32.985507 13205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174" Netns:"/var/run/netns/452c2937-c7a4-45dd-8fa3-b84427d7da0e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: > pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: E0319 09:30:32.985554 13205 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network 
sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174" Netns:"/var/run/netns/452c2937-c7a4-45dd-8fa3-b84427d7da0e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: > 
pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:30:32.985681 master-0 kubenswrapper[13205]: E0319 09:30:32.985625 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-2-master-0_openshift-kube-controller-manager(e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-2-master-0_openshift-kube-controller-manager(e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174\\\" Netns:\\\"/var/run/netns/452c2937-c7a4-45dd-8fa3-b84427d7da0e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=4f6cf7391161f874391d2582b415d4e8e9df597e8c87dc9dd40397bb5139f174;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Mar 19 09:30:33.885783 master-0 kubenswrapper[13205]: I0319 09:30:33.885707 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:30:33.886374 master-0 kubenswrapper[13205]: I0319 09:30:33.886308 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:30:35.228341 master-0 kubenswrapper[13205]: E0319 09:30:35.228239 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:30:37.144515 master-0 kubenswrapper[13205]: I0319 09:30:37.144431 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:30:37.145293 master-0 kubenswrapper[13205]: I0319 09:30:37.144504 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:30:40.486575 master-0 kubenswrapper[13205]: I0319 09:30:40.486511 13205 scope.go:117] "RemoveContainer" containerID="38188bacf1f255e1d32b674a6958ea273887bddfafc520e43e66c722c9e5e320" Mar 19 09:30:40.944592 master-0 kubenswrapper[13205]: E0319 09:30:40.944539 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:30:40.944592 master-0 kubenswrapper[13205]: E0319 09:30:40.944582 13205 kubelet_node_status.go:572] "Unable to update 
node status" err="update node status exceeds retry count" Mar 19 09:30:46.926929 master-0 kubenswrapper[13205]: E0319 09:30:46.926698 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 09:30:47.143702 master-0 kubenswrapper[13205]: I0319 09:30:47.143653 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:30:47.143909 master-0 kubenswrapper[13205]: I0319 09:30:47.143707 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:30:47.929682 master-0 kubenswrapper[13205]: E0319 09:30:47.929424 13205 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3407ebe36a8c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:28:31.857519244 +0000 UTC m=+297.189826132,LastTimestamp:2026-03-19 09:28:31.857519244 +0000 UTC m=+297.189826132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:30:51.036621 master-0 kubenswrapper[13205]: I0319 09:30:51.036542 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr42z" event={"ID":"741c9d25-7634-41c0-bfe4-b7a15de4b341","Type":"ContainerDied","Data":"38188bacf1f255e1d32b674a6958ea273887bddfafc520e43e66c722c9e5e320"} Mar 19 09:30:51.036621 master-0 kubenswrapper[13205]: I0319 09:30:51.036614 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38188bacf1f255e1d32b674a6958ea273887bddfafc520e43e66c722c9e5e320" Mar 19 09:30:54.063910 master-0 kubenswrapper[13205]: I0319 09:30:54.063860 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/1.log" Mar 19 09:30:54.067185 master-0 kubenswrapper[13205]: I0319 09:30:54.067120 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/0.log" Mar 19 09:30:54.067315 master-0 kubenswrapper[13205]: I0319 09:30:54.067226 13205 generic.go:334] "Generic (PLEG): container finished" podID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" containerID="904c1bde5e53dae3086fb4db2d5212100fb16c3338576e000620100b07cbb80c" exitCode=1 Mar 19 09:30:54.067315 master-0 kubenswrapper[13205]: I0319 09:30:54.067274 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerDied","Data":"904c1bde5e53dae3086fb4db2d5212100fb16c3338576e000620100b07cbb80c"} Mar 19 09:30:54.067451 master-0 kubenswrapper[13205]: I0319 09:30:54.067325 13205 scope.go:117] "RemoveContainer" 
containerID="5732fa3ff6aaea0289273acea825bfaab46efed575658d801c96fd54df3453e0" Mar 19 09:30:54.068145 master-0 kubenswrapper[13205]: I0319 09:30:54.068089 13205 scope.go:117] "RemoveContainer" containerID="904c1bde5e53dae3086fb4db2d5212100fb16c3338576e000620100b07cbb80c" Mar 19 09:30:54.068751 master-0 kubenswrapper[13205]: E0319 09:30:54.068680 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-v9s9c_openshift-cluster-storage-operator(dc65ec1f-b8fb-40d6-ac39-46b255a33221)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" podUID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" Mar 19 09:30:54.692965 master-0 kubenswrapper[13205]: E0319 09:30:54.690949 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 09:30:55.078931 master-0 kubenswrapper[13205]: I0319 09:30:55.078847 13205 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="56803581b2ac5682333082d151539c605b3cfe4859b766e6bcb4221b8a45f07c" exitCode=0 Mar 19 09:30:55.080299 master-0 kubenswrapper[13205]: I0319 09:30:55.078927 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"56803581b2ac5682333082d151539c605b3cfe4859b766e6bcb4221b8a45f07c"} Mar 19 09:30:55.080299 master-0 kubenswrapper[13205]: I0319 09:30:55.079395 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:30:55.080299 master-0 kubenswrapper[13205]: I0319 09:30:55.079415 13205 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:30:55.081371 master-0 kubenswrapper[13205]: I0319 09:30:55.081305 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/1.log" Mar 19 09:30:57.145258 master-0 kubenswrapper[13205]: I0319 09:30:57.145167 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:30:57.146113 master-0 kubenswrapper[13205]: I0319 09:30:57.145254 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:30:58.108261 master-0 kubenswrapper[13205]: I0319 09:30:58.108227 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-5jsnd_9a6c1523-e77c-4aac-814c-05d41215c42f/package-server-manager/0.log" Mar 19 09:30:58.109269 master-0 kubenswrapper[13205]: I0319 09:30:58.109238 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" event={"ID":"9a6c1523-e77c-4aac-814c-05d41215c42f","Type":"ContainerDied","Data":"6eac1964a5e72aa12c65bafd864366d67117587af3993a267a04a961b129a449"} Mar 19 09:30:58.109465 master-0 kubenswrapper[13205]: I0319 09:30:58.109206 13205 generic.go:334] "Generic (PLEG): container finished" podID="9a6c1523-e77c-4aac-814c-05d41215c42f" containerID="6eac1964a5e72aa12c65bafd864366d67117587af3993a267a04a961b129a449" exitCode=1 
Mar 19 09:30:58.110075 master-0 kubenswrapper[13205]: I0319 09:30:58.110060 13205 scope.go:117] "RemoveContainer" containerID="6eac1964a5e72aa12c65bafd864366d67117587af3993a267a04a961b129a449" Mar 19 09:30:58.113742 master-0 kubenswrapper[13205]: I0319 09:30:58.113716 13205 generic.go:334] "Generic (PLEG): container finished" podID="41659a48-5eea-41cd-8b2a-b683dc15cc11" containerID="1e26847fb86eb6f61757cb6db0dd5524be844b8153caf7a191fdb5d34b73f968" exitCode=0 Mar 19 09:30:58.113875 master-0 kubenswrapper[13205]: I0319 09:30:58.113789 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" event={"ID":"41659a48-5eea-41cd-8b2a-b683dc15cc11","Type":"ContainerDied","Data":"1e26847fb86eb6f61757cb6db0dd5524be844b8153caf7a191fdb5d34b73f968"} Mar 19 09:30:58.114785 master-0 kubenswrapper[13205]: I0319 09:30:58.114769 13205 scope.go:117] "RemoveContainer" containerID="1e26847fb86eb6f61757cb6db0dd5524be844b8153caf7a191fdb5d34b73f968" Mar 19 09:30:58.116436 master-0 kubenswrapper[13205]: I0319 09:30:58.116385 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-v9n8l_9a0f93ac-a77b-488a-bcc4-a45702a9e32d/control-plane-machine-set-operator/0.log" Mar 19 09:30:58.116606 master-0 kubenswrapper[13205]: I0319 09:30:58.116589 13205 generic.go:334] "Generic (PLEG): container finished" podID="9a0f93ac-a77b-488a-bcc4-a45702a9e32d" containerID="a80231b434755caf6695f3beee3129592e0b9172da3c6519260dc101567b4d3d" exitCode=1 Mar 19 09:30:58.116728 master-0 kubenswrapper[13205]: I0319 09:30:58.116688 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" event={"ID":"9a0f93ac-a77b-488a-bcc4-a45702a9e32d","Type":"ContainerDied","Data":"a80231b434755caf6695f3beee3129592e0b9172da3c6519260dc101567b4d3d"} Mar 19 09:30:58.117315 master-0 kubenswrapper[13205]: I0319 
09:30:58.117261 13205 scope.go:117] "RemoveContainer" containerID="a80231b434755caf6695f3beee3129592e0b9172da3c6519260dc101567b4d3d" Mar 19 09:30:59.130313 master-0 kubenswrapper[13205]: I0319 09:30:59.130224 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-5jsnd_9a6c1523-e77c-4aac-814c-05d41215c42f/package-server-manager/0.log" Mar 19 09:30:59.131993 master-0 kubenswrapper[13205]: I0319 09:30:59.131891 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" event={"ID":"9a6c1523-e77c-4aac-814c-05d41215c42f","Type":"ContainerStarted","Data":"edf28b5fe362fc4e737c0cbc17302c5e4eaea949b7176775a5ab0d2c27a7ac98"} Mar 19 09:30:59.132446 master-0 kubenswrapper[13205]: I0319 09:30:59.132349 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:30:59.137851 master-0 kubenswrapper[13205]: I0319 09:30:59.137773 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-hcnr7" event={"ID":"41659a48-5eea-41cd-8b2a-b683dc15cc11","Type":"ContainerStarted","Data":"cdbaacac9cad55bb07d4e6a2a4e30341b200b31a785b538fddad9aaf3cbd8926"} Mar 19 09:30:59.141992 master-0 kubenswrapper[13205]: I0319 09:30:59.141915 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-v9n8l_9a0f93ac-a77b-488a-bcc4-a45702a9e32d/control-plane-machine-set-operator/0.log" Mar 19 09:30:59.142161 master-0 kubenswrapper[13205]: I0319 09:30:59.142071 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-v9n8l" 
event={"ID":"9a0f93ac-a77b-488a-bcc4-a45702a9e32d","Type":"ContainerStarted","Data":"af0502fd78ddd8f0541adc196835fe856e0216f219776e6115983643f02fef70"} Mar 19 09:30:59.145857 master-0 kubenswrapper[13205]: I0319 09:30:59.145812 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/cluster-baremetal-operator/0.log" Mar 19 09:30:59.146001 master-0 kubenswrapper[13205]: I0319 09:30:59.145893 13205 generic.go:334] "Generic (PLEG): container finished" podID="87b757ff-ca45-4dc7-b31f-ccca53cb2354" containerID="054cdc5e26bc9173015636411bf16495940c7ff09c3cacff260412b26b73df36" exitCode=1 Mar 19 09:30:59.146001 master-0 kubenswrapper[13205]: I0319 09:30:59.145957 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" event={"ID":"87b757ff-ca45-4dc7-b31f-ccca53cb2354","Type":"ContainerDied","Data":"054cdc5e26bc9173015636411bf16495940c7ff09c3cacff260412b26b73df36"} Mar 19 09:30:59.146889 master-0 kubenswrapper[13205]: I0319 09:30:59.146822 13205 scope.go:117] "RemoveContainer" containerID="054cdc5e26bc9173015636411bf16495940c7ff09c3cacff260412b26b73df36" Mar 19 09:31:00.159801 master-0 kubenswrapper[13205]: I0319 09:31:00.159730 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/cluster-baremetal-operator/0.log" Mar 19 09:31:00.160895 master-0 kubenswrapper[13205]: I0319 09:31:00.159929 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nf2m5" event={"ID":"87b757ff-ca45-4dc7-b31f-ccca53cb2354","Type":"ContainerStarted","Data":"2c8f9c575747f64a09ee2dedd75832ae4c3859f1b354cf74d26d6323abf27b3d"} Mar 19 09:31:00.954920 master-0 kubenswrapper[13205]: E0319 09:31:00.954301 13205 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2a86d5923559588380116772739510b0a665d181819fddbf855acf63cecadb32\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3721fab205c02b53b35057522b1ebb89ac3643d000d1fc2418aece7d395f7627\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746376668},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7
fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:9c6279da50a760828b0dabbd6e3baa384cadab3605c4d46e611ea749584e4c4a\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:cffdd23fb5aa53a255c309021bf3d4997520cb803392fa3b6aaa46563a46fb12\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1224180940},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4
a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBy
tes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446b
edea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:03.928253 master-0 kubenswrapper[13205]: E0319 09:31:03.928119 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 09:31:04.849740 master-0 kubenswrapper[13205]: I0319 09:31:04.849676 13205 scope.go:117] "RemoveContainer" containerID="904c1bde5e53dae3086fb4db2d5212100fb16c3338576e000620100b07cbb80c" Mar 19 09:31:05.206210 master-0 kubenswrapper[13205]: I0319 09:31:05.206053 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/1.log" Mar 19 09:31:05.206210 master-0 kubenswrapper[13205]: I0319 09:31:05.206133 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerStarted","Data":"934e0f9ef0563dabbbd5cd5dea0a05f248ba5e8892487c996388569b54c255eb"} Mar 19 09:31:06.218018 master-0 kubenswrapper[13205]: I0319 09:31:06.217941 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-598fbc5f8f-smksb_9076d131-644a-4332-8a70-34f6b0f71575/cluster-node-tuning-operator/0.log" Mar 19 09:31:06.218708 master-0 kubenswrapper[13205]: I0319 09:31:06.218028 13205 generic.go:334] "Generic (PLEG): container finished" podID="9076d131-644a-4332-8a70-34f6b0f71575" 
containerID="4bddbe0181ee7be4a4759326c4ea480ecc4661debbd026d4925858f87d0a1138" exitCode=1 Mar 19 09:31:06.218708 master-0 kubenswrapper[13205]: I0319 09:31:06.218070 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" event={"ID":"9076d131-644a-4332-8a70-34f6b0f71575","Type":"ContainerDied","Data":"4bddbe0181ee7be4a4759326c4ea480ecc4661debbd026d4925858f87d0a1138"} Mar 19 09:31:06.218898 master-0 kubenswrapper[13205]: I0319 09:31:06.218852 13205 scope.go:117] "RemoveContainer" containerID="4bddbe0181ee7be4a4759326c4ea480ecc4661debbd026d4925858f87d0a1138" Mar 19 09:31:07.144389 master-0 kubenswrapper[13205]: I0319 09:31:07.144298 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:31:07.144389 master-0 kubenswrapper[13205]: I0319 09:31:07.144383 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:31:07.230119 master-0 kubenswrapper[13205]: I0319 09:31:07.230023 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-598fbc5f8f-smksb_9076d131-644a-4332-8a70-34f6b0f71575/cluster-node-tuning-operator/0.log" Mar 19 09:31:07.230981 master-0 kubenswrapper[13205]: I0319 09:31:07.230123 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-smksb" 
event={"ID":"9076d131-644a-4332-8a70-34f6b0f71575","Type":"ContainerStarted","Data":"cdf8199a4b300498ea2007c338e9d07b68cef6130d036a786e0376f10b93412c"} Mar 19 09:31:08.244889 master-0 kubenswrapper[13205]: I0319 09:31:08.244813 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-wdh89_e8ca673b-2a2f-4ecf-a142-7fe10fcac707/machine-approver-controller/0.log" Mar 19 09:31:08.246070 master-0 kubenswrapper[13205]: I0319 09:31:08.245968 13205 generic.go:334] "Generic (PLEG): container finished" podID="e8ca673b-2a2f-4ecf-a142-7fe10fcac707" containerID="6028345572a8ef50a1435ba40a05eb5b44cff27aa19ce4a85fcd0e6d16aac231" exitCode=255 Mar 19 09:31:08.246070 master-0 kubenswrapper[13205]: I0319 09:31:08.246054 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" event={"ID":"e8ca673b-2a2f-4ecf-a142-7fe10fcac707","Type":"ContainerDied","Data":"6028345572a8ef50a1435ba40a05eb5b44cff27aa19ce4a85fcd0e6d16aac231"} Mar 19 09:31:08.246754 master-0 kubenswrapper[13205]: I0319 09:31:08.246717 13205 scope.go:117] "RemoveContainer" containerID="6028345572a8ef50a1435ba40a05eb5b44cff27aa19ce4a85fcd0e6d16aac231" Mar 19 09:31:09.269614 master-0 kubenswrapper[13205]: I0319 09:31:09.269569 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-wdh89_e8ca673b-2a2f-4ecf-a142-7fe10fcac707/machine-approver-controller/0.log" Mar 19 09:31:09.270142 master-0 kubenswrapper[13205]: I0319 09:31:09.270019 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-wdh89" event={"ID":"e8ca673b-2a2f-4ecf-a142-7fe10fcac707","Type":"ContainerStarted","Data":"4c0f34f806484e068a6ec4e1e3c385eea73842f0edd7b39c48983959156b933b"} Mar 19 09:31:10.955392 master-0 kubenswrapper[13205]: E0319 09:31:10.955330 13205 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:12.272689 master-0 kubenswrapper[13205]: I0319 09:31:12.272586 13205 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": dial tcp 192.168.32.10:10259: connect: connection refused" start-of-body= Mar 19 09:31:12.273674 master-0 kubenswrapper[13205]: I0319 09:31:12.272691 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": dial tcp 192.168.32.10:10259: connect: connection refused" Mar 19 09:31:12.299568 master-0 kubenswrapper[13205]: I0319 09:31:12.299467 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log" Mar 19 09:31:12.300860 master-0 kubenswrapper[13205]: I0319 09:31:12.300788 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="1cd8ba1cf946b8e03e8d14ad1a9ca15bc751df12a73a64e9d4a3982985753d17" exitCode=0 Mar 19 09:31:12.301046 master-0 kubenswrapper[13205]: I0319 09:31:12.300881 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"1cd8ba1cf946b8e03e8d14ad1a9ca15bc751df12a73a64e9d4a3982985753d17"} Mar 19 09:31:12.302297 master-0 kubenswrapper[13205]: I0319 
09:31:12.301740 13205 scope.go:117] "RemoveContainer" containerID="1cd8ba1cf946b8e03e8d14ad1a9ca15bc751df12a73a64e9d4a3982985753d17" Mar 19 09:31:12.304239 master-0 kubenswrapper[13205]: I0319 09:31:12.304199 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log" Mar 19 09:31:12.304725 master-0 kubenswrapper[13205]: I0319 09:31:12.304665 13205 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="57919871ecdce20adcf14d4b3e782688c40e27d380e27e5683da1cfdca89a184" exitCode=1 Mar 19 09:31:12.304725 master-0 kubenswrapper[13205]: I0319 09:31:12.304717 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"57919871ecdce20adcf14d4b3e782688c40e27d380e27e5683da1cfdca89a184"} Mar 19 09:31:12.305204 master-0 kubenswrapper[13205]: I0319 09:31:12.305168 13205 scope.go:117] "RemoveContainer" containerID="57919871ecdce20adcf14d4b3e782688c40e27d380e27e5683da1cfdca89a184" Mar 19 09:31:13.317514 master-0 kubenswrapper[13205]: I0319 09:31:13.317436 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log" Mar 19 09:31:13.319173 master-0 kubenswrapper[13205]: I0319 09:31:13.319090 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"0a17e7848d06038a69e2540781de2a324d8067bd69c0598df08e190c706b5066"} Mar 19 09:31:13.319706 master-0 kubenswrapper[13205]: I0319 09:31:13.319664 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
Mar 19 09:31:13.324079 master-0 kubenswrapper[13205]: I0319 09:31:13.324039 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log" Mar 19 09:31:13.325985 master-0 kubenswrapper[13205]: I0319 09:31:13.325946 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5"} Mar 19 09:31:14.660561 master-0 kubenswrapper[13205]: I0319 09:31:14.660457 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:31:17.143997 master-0 kubenswrapper[13205]: I0319 09:31:17.143841 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:31:17.143997 master-0 kubenswrapper[13205]: I0319 09:31:17.143921 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:31:20.929742 master-0 kubenswrapper[13205]: E0319 09:31:20.929630 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 09:31:20.956917 master-0 kubenswrapper[13205]: E0319 
09:31:20.956578 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:21.638501 master-0 kubenswrapper[13205]: I0319 09:31:21.638343 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:31:21.639953 master-0 kubenswrapper[13205]: I0319 09:31:21.639864 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" start-of-body= Mar 19 09:31:21.640099 master-0 kubenswrapper[13205]: I0319 09:31:21.640001 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Mar 19 09:31:21.933358 master-0 kubenswrapper[13205]: E0319 09:31:21.933057 13205 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3407ebe4f82c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 
09:28:31.857621036 +0000 UTC m=+297.189927924,LastTimestamp:2026-03-19 09:28:31.857621036 +0000 UTC m=+297.189927924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:31:27.144715 master-0 kubenswrapper[13205]: I0319 09:31:27.144627 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:31:27.145766 master-0 kubenswrapper[13205]: I0319 09:31:27.144719 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:31:29.083309 master-0 kubenswrapper[13205]: E0319 09:31:29.083232 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:29.481714 master-0 kubenswrapper[13205]: I0319 09:31:29.481626 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"27caed1c83f735dcbc99f9bd36cdfed9ced2aa7b4769f1a360fb322a75e22617"} Mar 19 09:31:30.499835 master-0 kubenswrapper[13205]: I0319 09:31:30.499680 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"56d0f0688fbec7f436d6d07ce65ab2e10dd1824e3126acd396215fcb024c6bba"} Mar 19 09:31:30.499835 master-0 kubenswrapper[13205]: I0319 09:31:30.499746 13205 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"0d06b2e5e75eb36dda03e09e2ee9bd063a97047ced4bccc2c29d004efa386804"} Mar 19 09:31:30.499835 master-0 kubenswrapper[13205]: I0319 09:31:30.499765 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"b0ba28821ce90f8dc1482f75f3419e88917905dc452d84bfd05912a5cab6fb84"} Mar 19 09:31:30.629605 master-0 kubenswrapper[13205]: I0319 09:31:30.628125 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-5jsnd" Mar 19 09:31:30.957362 master-0 kubenswrapper[13205]: E0319 09:31:30.957272 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:31.516797 master-0 kubenswrapper[13205]: I0319 09:31:31.516702 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"23f677dffcb898706781507a2d5c64332de91c2d8e980694672d6c67f00520b5"} Mar 19 09:31:31.518188 master-0 kubenswrapper[13205]: I0319 09:31:31.518118 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:31:31.518307 master-0 kubenswrapper[13205]: I0319 09:31:31.518221 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:31:31.640037 master-0 kubenswrapper[13205]: I0319 09:31:31.639932 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" start-of-body= Mar 19 09:31:31.640310 master-0 kubenswrapper[13205]: I0319 09:31:31.640035 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Mar 19 09:31:32.293230 master-0 kubenswrapper[13205]: I0319 09:31:32.293106 13205 status_manager.go:851] "Failed to get status for pod" podUID="1d5e311c-1c6a-4d5d-8c2b-493025593934" pod="openshift-controller-manager/controller-manager-6c8fd866bf-g46sj" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods controller-manager-6c8fd866bf-g46sj)" Mar 19 09:31:34.574825 master-0 kubenswrapper[13205]: E0319 09:31:34.574746 13205 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:31:34.574825 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab" Netns:"/var/run/netns/a8061b83-60b3-461f-8213-00a59d08d6c3" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:31:34.574825 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:31:34.574825 master-0 kubenswrapper[13205]: > Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: E0319 09:31:34.574870 13205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab" Netns:"/var/run/netns/a8061b83-60b3-461f-8213-00a59d08d6c3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: > pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: E0319 09:31:34.574915 13205 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: rpc error: code = Unknown desc = failed to create pod network 
sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab" Netns:"/var/run/netns/a8061b83-60b3-461f-8213-00a59d08d6c3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: > 
pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:31:34.575335 master-0 kubenswrapper[13205]: E0319 09:31:34.575057 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-2-master-0_openshift-kube-controller-manager(e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-2-master-0_openshift-kube-controller-manager(e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4_0(dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab\\\" Netns:\\\"/var/run/netns/a8061b83-60b3-461f-8213-00a59d08d6c3\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=dc2891a0810d157ffb9dd68e3a160fa12119390a2ef20d6d5a730020017857ab;K8S_POD_UID=e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" Mar 19 09:31:34.883288 master-0 kubenswrapper[13205]: I0319 09:31:34.883078 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:34.883288 master-0 kubenswrapper[13205]: I0319 09:31:34.883181 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:35.205997 master-0 kubenswrapper[13205]: E0319 09:31:35.205868 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:31:35.553160 master-0 kubenswrapper[13205]: I0319 09:31:35.553122 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/2.log" Mar 19 
09:31:35.554135 master-0 kubenswrapper[13205]: I0319 09:31:35.554091 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/1.log" Mar 19 09:31:35.554234 master-0 kubenswrapper[13205]: I0319 09:31:35.554155 13205 generic.go:334] "Generic (PLEG): container finished" podID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" containerID="934e0f9ef0563dabbbd5cd5dea0a05f248ba5e8892487c996388569b54c255eb" exitCode=1 Mar 19 09:31:35.554234 master-0 kubenswrapper[13205]: I0319 09:31:35.554217 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:31:35.554384 master-0 kubenswrapper[13205]: I0319 09:31:35.554219 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerDied","Data":"934e0f9ef0563dabbbd5cd5dea0a05f248ba5e8892487c996388569b54c255eb"} Mar 19 09:31:35.554384 master-0 kubenswrapper[13205]: I0319 09:31:35.554280 13205 scope.go:117] "RemoveContainer" containerID="904c1bde5e53dae3086fb4db2d5212100fb16c3338576e000620100b07cbb80c" Mar 19 09:31:35.554698 master-0 kubenswrapper[13205]: I0319 09:31:35.554656 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 09:31:35.555034 master-0 kubenswrapper[13205]: I0319 09:31:35.554996 13205 scope.go:117] "RemoveContainer" containerID="934e0f9ef0563dabbbd5cd5dea0a05f248ba5e8892487c996388569b54c255eb" Mar 19 09:31:35.555656 master-0 kubenswrapper[13205]: E0319 09:31:35.555285 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-v9s9c_openshift-cluster-storage-operator(dc65ec1f-b8fb-40d6-ac39-46b255a33221)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" podUID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" Mar 19 09:31:36.567753 master-0 kubenswrapper[13205]: I0319 09:31:36.567519 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/2.log" Mar 19 09:31:37.144166 master-0 kubenswrapper[13205]: I0319 09:31:37.144070 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:31:37.144452 master-0 kubenswrapper[13205]: I0319 09:31:37.144193 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:31:37.930188 master-0 kubenswrapper[13205]: E0319 09:31:37.930112 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 09:31:40.958330 master-0 kubenswrapper[13205]: E0319 09:31:40.958227 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:40.958330 master-0 kubenswrapper[13205]: E0319 09:31:40.958291 13205 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:31:41.640188 master-0 kubenswrapper[13205]: I0319 09:31:41.640141 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" start-of-body= Mar 19 09:31:41.640505 master-0 kubenswrapper[13205]: I0319 09:31:41.640473 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Mar 19 09:31:41.640678 master-0 kubenswrapper[13205]: I0319 09:31:41.640661 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:31:41.641583 master-0 kubenswrapper[13205]: I0319 09:31:41.641556 13205 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 09:31:41.641797 master-0 kubenswrapper[13205]: I0319 09:31:41.641775 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" containerID="cri-o://bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5" gracePeriod=30 Mar 19 09:31:43.165507 master-0 kubenswrapper[13205]: I0319 09:31:43.165447 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:43.176517 master-0 kubenswrapper[13205]: W0319 09:31:43.176453 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode34f1e7e_9148_4fa7_8e5b_6b77ff2c62f4.slice/crio-d5662f02e0c233de962a55aa5a4b5f0bff1496b04c43b153ee06f94e129b37f3 WatchSource:0}: Error finding container d5662f02e0c233de962a55aa5a4b5f0bff1496b04c43b153ee06f94e129b37f3: Status 404 returned error can't find the container with id d5662f02e0c233de962a55aa5a4b5f0bff1496b04c43b153ee06f94e129b37f3 Mar 19 09:31:43.226375 master-0 kubenswrapper[13205]: I0319 09:31:43.226202 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:31:43.242265 master-0 kubenswrapper[13205]: I0319 09:31:43.242212 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 19 09:31:43.249391 master-0 kubenswrapper[13205]: I0319 09:31:43.249011 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:31:43.252351 master-0 kubenswrapper[13205]: I0319 09:31:43.252281 13205 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79f67cdc89-bx72w" podStartSLOduration=192.840087203 podStartE2EDuration="3m18.252268366s" podCreationTimestamp="2026-03-19 09:28:25 +0000 UTC" firstStartedPulling="2026-03-19 09:28:30.145441687 +0000 UTC m=+295.477748605" lastFinishedPulling="2026-03-19 09:28:35.55762288 +0000 UTC m=+300.889929768" observedRunningTime="2026-03-19 09:31:43.189986463 +0000 UTC m=+488.522293351" watchObservedRunningTime="2026-03-19 09:31:43.252268366 +0000 UTC m=+488.584575264" Mar 19 09:31:43.386189 master-0 kubenswrapper[13205]: I0319 09:31:43.386118 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"] Mar 19 09:31:43.396126 master-0 kubenswrapper[13205]: I0319 09:31:43.395171 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8555fbf585-9ggfr"] Mar 19 09:31:43.432369 master-0 kubenswrapper[13205]: I0319 09:31:43.432229 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-fd57cd489-6jmpf"] Mar 19 09:31:43.442600 master-0 kubenswrapper[13205]: I0319 09:31:43.438057 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-fd57cd489-6jmpf"] Mar 19 09:31:43.641137 master-0 kubenswrapper[13205]: I0319 09:31:43.641074 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:31:43.660200 master-0 kubenswrapper[13205]: I0319 09:31:43.658629 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"] Mar 19 09:31:43.666367 master-0 kubenswrapper[13205]: I0319 09:31:43.666287 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" 
event={"ID":"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4","Type":"ContainerStarted","Data":"93aa177a299a996e489a2b97d047f9a2e5110bba02b78838ebfd5ecc32f7b998"} Mar 19 09:31:43.666367 master-0 kubenswrapper[13205]: I0319 09:31:43.666355 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4","Type":"ContainerStarted","Data":"d5662f02e0c233de962a55aa5a4b5f0bff1496b04c43b153ee06f94e129b37f3"} Mar 19 09:31:43.668372 master-0 kubenswrapper[13205]: I0319 09:31:43.668320 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c8fd866bf-g46sj"] Mar 19 09:31:43.744243 master-0 kubenswrapper[13205]: I0319 09:31:43.744095 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.744078666 podStartE2EDuration="744.078666ms" podCreationTimestamp="2026-03-19 09:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:31:43.740852635 +0000 UTC m=+489.073159533" watchObservedRunningTime="2026-03-19 09:31:43.744078666 +0000 UTC m=+489.076385554" Mar 19 09:31:43.760336 master-0 kubenswrapper[13205]: I0319 09:31:43.760241 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=193.760219005 podStartE2EDuration="3m13.760219005s" podCreationTimestamp="2026-03-19 09:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:31:43.753878238 +0000 UTC m=+489.086185146" watchObservedRunningTime="2026-03-19 09:31:43.760219005 +0000 UTC m=+489.092525913" Mar 19 09:31:44.676104 master-0 kubenswrapper[13205]: I0319 09:31:44.676001 13205 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:31:44.676104 master-0 kubenswrapper[13205]: I0319 09:31:44.676078 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="01f1f75a-c49b-4538-b629-e7911d4945f6" Mar 19 09:31:44.867344 master-0 kubenswrapper[13205]: I0319 09:31:44.867278 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5e311c-1c6a-4d5d-8c2b-493025593934" path="/var/lib/kubelet/pods/1d5e311c-1c6a-4d5d-8c2b-493025593934/volumes" Mar 19 09:31:44.868460 master-0 kubenswrapper[13205]: I0319 09:31:44.868417 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d66357-fcee-4e70-b563-5895b978ab55" path="/var/lib/kubelet/pods/67d66357-fcee-4e70-b563-5895b978ab55/volumes" Mar 19 09:31:44.869590 master-0 kubenswrapper[13205]: I0319 09:31:44.869498 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" path="/var/lib/kubelet/pods/e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5/volumes" Mar 19 09:31:44.917948 master-0 kubenswrapper[13205]: I0319 09:31:44.917860 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:45.694921 master-0 kubenswrapper[13205]: I0319 09:31:45.694852 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:47.144278 master-0 kubenswrapper[13205]: I0319 09:31:47.144230 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:31:47.144972 master-0 kubenswrapper[13205]: I0319 09:31:47.144942 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" 
podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:31:48.849863 master-0 kubenswrapper[13205]: I0319 09:31:48.849806 13205 scope.go:117] "RemoveContainer" containerID="934e0f9ef0563dabbbd5cd5dea0a05f248ba5e8892487c996388569b54c255eb" Mar 19 09:31:48.850547 master-0 kubenswrapper[13205]: E0319 09:31:48.850019 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-v9s9c_openshift-cluster-storage-operator(dc65ec1f-b8fb-40d6-ac39-46b255a33221)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" podUID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" Mar 19 09:31:57.144724 master-0 kubenswrapper[13205]: I0319 09:31:57.144620 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:31:57.145614 master-0 kubenswrapper[13205]: I0319 09:31:57.144722 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:32:00.848719 master-0 kubenswrapper[13205]: I0319 09:32:00.848651 13205 scope.go:117] "RemoveContainer" containerID="934e0f9ef0563dabbbd5cd5dea0a05f248ba5e8892487c996388569b54c255eb" Mar 19 09:32:01.036656 master-0 kubenswrapper[13205]: E0319 09:32:01.036383 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to 
patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:31:51Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:31:51Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:31:51Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:31:51Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2a86d5923559588380116772739510b0a665d181819fddbf855acf63cecadb32\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3721fab205c02b53b35057522b1ebb89ac3643d000d1fc2418aece7d395f7627\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746376668},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995
f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:9c6279da50a760828b0dabbd6e3baa384cadab3605c4d46e611ea749584e4c4a\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:cffdd23fb5aa53a255c309021bf3d4997520cb803392fa3b6aaa46563a46fb12\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1224180940},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd7
75d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189ee
e279aae984b\\\"],\\\"sizeBytes\\\":487096305}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:32:01.811132 master-0 kubenswrapper[13205]: I0319 09:32:01.811062 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/2.log" Mar 19 09:32:01.811367 master-0 kubenswrapper[13205]: I0319 09:32:01.811153 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerStarted","Data":"6ae902174b9d621c212682b70ad807c14a4e92599470fc3aaedbdc3e8e6191c4"} Mar 19 09:32:03.273958 master-0 kubenswrapper[13205]: I0319 09:32:03.273891 13205 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:32:03.274862 master-0 kubenswrapper[13205]: I0319 09:32:03.274723 13205 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:32:07.145216 master-0 kubenswrapper[13205]: I0319 09:32:07.145127 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:32:07.146391 master-0 kubenswrapper[13205]: I0319 09:32:07.145243 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:32:07.597359 master-0 kubenswrapper[13205]: I0319 09:32:07.597218 13205 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:32:07.599631 master-0 kubenswrapper[13205]: I0319 09:32:07.597360 13205 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:32:11.037421 master-0 kubenswrapper[13205]: E0319 09:32:11.037270 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:32:11.835564 master-0 kubenswrapper[13205]: E0319 09:32:11.835416 13205 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda474cbd3d0d9d7ed4d0ff461a5e5fe1a.slice/crio-conmon-bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda474cbd3d0d9d7ed4d0ff461a5e5fe1a.slice/crio-bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:32:11.893936 master-0 kubenswrapper[13205]: I0319 09:32:11.893842 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/1.log" Mar 19 09:32:11.895135 master-0 kubenswrapper[13205]: I0319 09:32:11.895069 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log" Mar 19 09:32:11.896730 master-0 kubenswrapper[13205]: I0319 09:32:11.896673 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5" exitCode=137 Mar 19 09:32:11.896851 master-0 kubenswrapper[13205]: I0319 09:32:11.896731 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5"} Mar 19 09:32:11.896851 master-0 kubenswrapper[13205]: I0319 09:32:11.896810 13205 scope.go:117] "RemoveContainer" containerID="1cd8ba1cf946b8e03e8d14ad1a9ca15bc751df12a73a64e9d4a3982985753d17" Mar 19 09:32:12.911445 master-0 kubenswrapper[13205]: I0319 09:32:12.911353 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/1.log" Mar 19 09:32:12.913238 master-0 kubenswrapper[13205]: I0319 09:32:12.913160 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log" Mar 19 09:32:12.914795 master-0 kubenswrapper[13205]: I0319 09:32:12.914713 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"111f6b2ee66d02abafb1557877f87d32e2a7993224942e1860bba961d2423e46"} Mar 19 09:32:13.139321 master-0 kubenswrapper[13205]: I0319 09:32:13.139243 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:32:14.660415 master-0 kubenswrapper[13205]: I0319 09:32:14.660297 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:32:17.144971 master-0 kubenswrapper[13205]: I0319 09:32:17.144858 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:32:17.144971 master-0 kubenswrapper[13205]: I0319 09:32:17.144934 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:32:21.038630 master-0 kubenswrapper[13205]: E0319 
09:32:21.038554 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 19 09:32:21.639187 master-0 kubenswrapper[13205]: I0319 09:32:21.639093 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:32:24.639573 master-0 kubenswrapper[13205]: I0319 09:32:24.639359 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:32:24.639573 master-0 kubenswrapper[13205]: I0319 09:32:24.639519 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:32:27.144055 master-0 kubenswrapper[13205]: I0319 09:32:27.143923 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:32:27.144055 master-0 kubenswrapper[13205]: I0319 09:32:27.144046 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" 
output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:32:31.039133 master-0 kubenswrapper[13205]: E0319 09:32:31.039044 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:32:31.129347 master-0 kubenswrapper[13205]: I0319 09:32:31.129288 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/3.log" Mar 19 09:32:31.130218 master-0 kubenswrapper[13205]: I0319 09:32:31.130169 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/2.log" Mar 19 09:32:31.130349 master-0 kubenswrapper[13205]: I0319 09:32:31.130230 13205 generic.go:334] "Generic (PLEG): container finished" podID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" containerID="6ae902174b9d621c212682b70ad807c14a4e92599470fc3aaedbdc3e8e6191c4" exitCode=1 Mar 19 09:32:31.130349 master-0 kubenswrapper[13205]: I0319 09:32:31.130267 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerDied","Data":"6ae902174b9d621c212682b70ad807c14a4e92599470fc3aaedbdc3e8e6191c4"} Mar 19 09:32:31.130349 master-0 kubenswrapper[13205]: I0319 09:32:31.130306 13205 scope.go:117] "RemoveContainer" containerID="934e0f9ef0563dabbbd5cd5dea0a05f248ba5e8892487c996388569b54c255eb" Mar 19 09:32:31.130848 master-0 kubenswrapper[13205]: I0319 09:32:31.130795 13205 scope.go:117] "RemoveContainer" 
containerID="6ae902174b9d621c212682b70ad807c14a4e92599470fc3aaedbdc3e8e6191c4" Mar 19 09:32:31.131096 master-0 kubenswrapper[13205]: E0319 09:32:31.131056 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-v9s9c_openshift-cluster-storage-operator(dc65ec1f-b8fb-40d6-ac39-46b255a33221)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" podUID="dc65ec1f-b8fb-40d6-ac39-46b255a33221" Mar 19 09:32:32.137631 master-0 kubenswrapper[13205]: I0319 09:32:32.137581 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/3.log" Mar 19 09:32:34.638970 master-0 kubenswrapper[13205]: I0319 09:32:34.638845 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:32:34.639794 master-0 kubenswrapper[13205]: I0319 09:32:34.638985 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:32:35.204231 master-0 kubenswrapper[13205]: E0319 09:32:35.204188 13205 manager.go:1116] Failed to create existing container: 
/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9
Mar 19 09:32:36.172783 master-0 kubenswrapper[13205]: I0319 09:32:36.172706 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/1.log"
Mar 19 09:32:36.173486 master-0 kubenswrapper[13205]: I0319 09:32:36.173441 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log"
Mar 19 09:32:36.174109 master-0 kubenswrapper[13205]: I0319 09:32:36.174082 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/2.log"
Mar 19 09:32:36.176054 master-0 kubenswrapper[13205]: I0319 09:32:36.175982 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8" exitCode=1
Mar 19 09:32:36.176111 master-0 kubenswrapper[13205]: I0319 09:32:36.176023 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8"}
Mar 19 09:32:36.176253 master-0 kubenswrapper[13205]: I0319 09:32:36.176183 13205 scope.go:117] "RemoveContainer" containerID="3694e4ab2ea2f543365f25e3f482176aa9345099d6e0f60c0e896413215ced6f"
Mar 19 09:32:36.176830 master-0 kubenswrapper[13205]: I0319 09:32:36.176800 13205 scope.go:117] "RemoveContainer" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8"
Mar 19 09:32:36.177193 master-0 kubenswrapper[13205]: E0319 09:32:36.177131 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a"
Mar 19 09:32:37.144642 master-0 kubenswrapper[13205]: I0319 09:32:37.144556 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:32:37.144642 master-0 kubenswrapper[13205]: I0319 09:32:37.144624 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:32:37.191001 master-0 kubenswrapper[13205]: I0319 09:32:37.190868 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/1.log"
Mar 19 09:32:37.192144 master-0 kubenswrapper[13205]: I0319 09:32:37.191922 13205 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log"
Mar 19 09:32:37.630851 master-0 kubenswrapper[13205]: I0319 09:32:37.630760 13205 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:32:37.631959 master-0 kubenswrapper[13205]: I0319 09:32:37.631900 13205 scope.go:117] "RemoveContainer" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8"
Mar 19 09:32:37.632497 master-0 kubenswrapper[13205]: E0319 09:32:37.632425 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a"
Mar 19 09:32:41.040127 master-0 kubenswrapper[13205]: E0319 09:32:41.040055 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:32:41.040127 master-0 kubenswrapper[13205]: E0319 09:32:41.040107 13205 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 09:32:41.849761 master-0 kubenswrapper[13205]: I0319 09:32:41.849622 13205 scope.go:117] "RemoveContainer" containerID="6ae902174b9d621c212682b70ad807c14a4e92599470fc3aaedbdc3e8e6191c4"
Mar 19 09:32:41.850160 master-0 kubenswrapper[13205]: E0319 09:32:41.850090 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-v9s9c_openshift-cluster-storage-operator(dc65ec1f-b8fb-40d6-ac39-46b255a33221)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" podUID="dc65ec1f-b8fb-40d6-ac39-46b255a33221"
Mar 19 09:32:44.012520 master-0 kubenswrapper[13205]: I0319 09:32:44.012432 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:32:44.013478 master-0 kubenswrapper[13205]: I0319 09:32:44.013203 13205 scope.go:117] "RemoveContainer" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8"
Mar 19 09:32:44.014046 master-0 kubenswrapper[13205]: E0319 09:32:44.013698 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a"
Mar 19 09:32:44.639438 master-0 kubenswrapper[13205]: I0319 09:32:44.639356 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:32:44.639795 master-0 kubenswrapper[13205]: I0319 09:32:44.639458 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a"
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:32:44.639795 master-0 kubenswrapper[13205]: I0319 09:32:44.639599 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:32:44.640514 master-0 kubenswrapper[13205]: I0319 09:32:44.640463 13205 scope.go:117] "RemoveContainer" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8"
Mar 19 09:32:44.640692 master-0 kubenswrapper[13205]: I0319 09:32:44.640623 13205 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"111f6b2ee66d02abafb1557877f87d32e2a7993224942e1860bba961d2423e46"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 19 09:32:44.640808 master-0 kubenswrapper[13205]: I0319 09:32:44.640768 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" containerID="cri-o://111f6b2ee66d02abafb1557877f87d32e2a7993224942e1860bba961d2423e46" gracePeriod=30
Mar 19 09:32:44.840797 master-0 kubenswrapper[13205]: I0319 09:32:44.840705 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:32:47.145297 master-0 kubenswrapper[13205]: I0319 09:32:47.145234 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:32:47.145882 master-0 kubenswrapper[13205]: I0319 09:32:47.145303 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:32:56.849467 master-0 kubenswrapper[13205]: I0319 09:32:56.849415 13205 scope.go:117] "RemoveContainer" containerID="6ae902174b9d621c212682b70ad807c14a4e92599470fc3aaedbdc3e8e6191c4"
Mar 19 09:32:56.850082 master-0 kubenswrapper[13205]: E0319 09:32:56.849659 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-v9s9c_openshift-cluster-storage-operator(dc65ec1f-b8fb-40d6-ac39-46b255a33221)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" podUID="dc65ec1f-b8fb-40d6-ac39-46b255a33221"
Mar 19 09:32:57.145309 master-0 kubenswrapper[13205]: I0319 09:32:57.144928 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:32:57.145309 master-0 kubenswrapper[13205]: I0319 09:32:57.145033 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:33:01.128657 master-0 kubenswrapper[13205]: E0319 09:33:01.128341 13205 kubelet_node_status.go:585]
"Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:32:51Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:32:51Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:32:51Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:32:51Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ddc5283caf2ced75a94ddf0e8a43c431889692007e8a875a187b25c35b45a9e2\\\"],\\\"sizeBytes\\\":2895807090},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2a86d5923559588380116772739510b0a665d181819fddbf855acf63cecadb32\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3721fab205c02b53b35057522b1ebb89ac3643d000d1fc2418aece7d395f7627\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746376668},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:89
8c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:9c6279da50a760828b0dabbd6e3baa384cadab3605c4d46e611ea749584e4c4a\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:cffdd23fb5aa53a255c309021bf3d4997520cb803392fa3b6aaa46563a46fb12\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1224180940},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb
28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bbb8535e2496de8389585ebbe696e7d7b9bad2b27785ad8a30a0fc683b0a22d\\\"],\\\"sizeBytes\\\":633877280},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98bf5467a01195e20aeea7d6f0b130ddacc00b73bc5312253b8c34e7208538f8\\\"],\\\"sizeBytes\\\":512235769},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\
"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:33:04.888918 master-0 kubenswrapper[13205]: I0319 09:33:04.888764 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:33:04.890373 master-0 kubenswrapper[13205]: E0319 09:33:04.890337 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b149c739-203d-4f5a-af11-dba6835ed71d" containerName="installer"
Mar 19 09:33:04.890579 master-0 kubenswrapper[13205]: I0319 09:33:04.890522 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="b149c739-203d-4f5a-af11-dba6835ed71d" containerName="installer"
Mar 19 09:33:04.890797 master-0 kubenswrapper[13205]: E0319 09:33:04.890771 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5e311c-1c6a-4d5d-8c2b-493025593934" containerName="controller-manager"
Mar 19 09:33:04.890945 master-0 kubenswrapper[13205]: I0319 09:33:04.890920 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5e311c-1c6a-4d5d-8c2b-493025593934" containerName="controller-manager"
Mar 19 09:33:04.891150 master-0 kubenswrapper[13205]: E0319 09:33:04.891125 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift"
Mar 19 09:33:04.891297 master-0 kubenswrapper[13205]: I0319 09:33:04.891274 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift"
Mar 19 09:33:04.891476 master-0 kubenswrapper[13205]: E0319 09:33:04.891452 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d66357-fcee-4e70-b563-5895b978ab55" containerName="route-controller-manager"
Mar 19 09:33:04.891657 master-0 kubenswrapper[13205]: I0319 09:33:04.891632 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d66357-fcee-4e70-b563-5895b978ab55" containerName="route-controller-manager"
Mar 19 09:33:04.892106 master-0 kubenswrapper[13205]: I0319 09:33:04.892076 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="b149c739-203d-4f5a-af11-dba6835ed71d" containerName="installer"
Mar 19 09:33:04.892318 master-0 kubenswrapper[13205]: I0319 09:33:04.892294 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d66357-fcee-4e70-b563-5895b978ab55" containerName="route-controller-manager"
Mar 19 09:33:04.892477 master-0 kubenswrapper[13205]: I0319 09:33:04.892454 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5e311c-1c6a-4d5d-8c2b-493025593934" containerName="controller-manager"
Mar 19 09:33:04.892677 master-0 kubenswrapper[13205]: I0319 09:33:04.892650 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="e199b2f1-71c1-40ff-b4f7-4bbbb66ed9f5" containerName="oauth-openshift"
Mar 19 09:33:04.894202 master-0 kubenswrapper[13205]: I0319 09:33:04.894133 13205 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:04.905159 master-0 kubenswrapper[13205]: I0319 09:33:04.898926 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 19 09:33:04.905159 master-0 kubenswrapper[13205]: I0319 09:33:04.899009 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-qzwhq"
Mar 19 09:33:04.908245 master-0 kubenswrapper[13205]: I0319 09:33:04.908180 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:33:05.062069 master-0 kubenswrapper[13205]: I0319 09:33:05.062002 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.062331 master-0 kubenswrapper[13205]: I0319 09:33:05.062173 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kube-api-access\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.062408 master-0 kubenswrapper[13205]: I0319 09:33:05.062366 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-var-lock\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.164069 master-0 kubenswrapper[13205]: I0319 09:33:05.163861 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kube-api-access\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.164069 master-0 kubenswrapper[13205]: I0319 09:33:05.163972 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-var-lock\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.164069 master-0 kubenswrapper[13205]: I0319 09:33:05.164024 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.164461 master-0 kubenswrapper[13205]: I0319 09:33:05.164110 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.164461 master-0 kubenswrapper[13205]: I0319 09:33:05.164455 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-var-lock\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.184124 master-0 kubenswrapper[13205]: I0319 09:33:05.184025 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kube-api-access\") pod \"installer-4-master-0\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.211785 master-0 kubenswrapper[13205]: I0319 09:33:05.211701 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:05.626378 master-0 kubenswrapper[13205]: I0319 09:33:05.626326 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:33:06.438219 master-0 kubenswrapper[13205]: I0319 09:33:06.438163 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log"
Mar 19 09:33:06.439210 master-0 kubenswrapper[13205]: I0319 09:33:06.438772 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log"
Mar 19 09:33:06.439210 master-0 kubenswrapper[13205]: I0319 09:33:06.439057 13205 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="6d8e777ee2c690477b890e212d15377f6f78a023a47f6d1ccdb66d4fd4236c20" exitCode=1
Mar 19 09:33:06.439210 master-0 kubenswrapper[13205]: I0319 09:33:06.439109 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"6d8e777ee2c690477b890e212d15377f6f78a023a47f6d1ccdb66d4fd4236c20"}
Mar 19 09:33:06.440198 master-0 kubenswrapper[13205]: I0319 09:33:06.440147 13205 scope.go:117] "RemoveContainer" containerID="6d8e777ee2c690477b890e212d15377f6f78a023a47f6d1ccdb66d4fd4236c20"
Mar 19 09:33:06.440648 master-0 kubenswrapper[13205]: I0319 09:33:06.440616 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f84f3f5a-d1da-4e9e-bfae-e6264c751372","Type":"ContainerStarted","Data":"3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8"}
Mar 19 09:33:06.440699 master-0 kubenswrapper[13205]: I0319 09:33:06.440658 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f84f3f5a-d1da-4e9e-bfae-e6264c751372","Type":"ContainerStarted","Data":"f31a0b1f8eafa06f8d2cd14f7ac5f32f093e33ea1d360285debc9fd4a89b85a0"}
Mar 19 09:33:06.498991 master-0 kubenswrapper[13205]: I0319 09:33:06.498576 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=2.498548377 podStartE2EDuration="2.498548377s" podCreationTimestamp="2026-03-19 09:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:33:06.48814039 +0000 UTC m=+571.820447368" watchObservedRunningTime="2026-03-19 09:33:06.498548377 +0000 UTC m=+571.830855305"
Mar 19 09:33:07.144428 master-0 kubenswrapper[13205]: I0319 09:33:07.144365 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:33:07.144714 master-0 kubenswrapper[13205]: I0319 09:33:07.144439 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:33:07.451163 master-0 kubenswrapper[13205]: I0319 09:33:07.451037 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log"
Mar 19 09:33:07.451746 master-0 kubenswrapper[13205]: I0319 09:33:07.451616 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log"
Mar 19 09:33:07.453728 master-0 kubenswrapper[13205]: I0319 09:33:07.453698 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"96f501d33ba99906fcc67f343ffb6c0314c555d3c6113a843511ffa7ed7f311a"}
Mar 19 09:33:07.456418 master-0 kubenswrapper[13205]: I0319 09:33:07.456393 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/1.log"
Mar 19 09:33:07.457083 master-0 kubenswrapper[13205]: I0319 09:33:07.457067 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log"
Mar 19 09:33:07.458187 master-0 kubenswrapper[13205]: I0319 09:33:07.458152 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log"
Mar 19 09:33:07.458275 master-0 kubenswrapper[13205]: I0319 09:33:07.458224 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="d2773f59c5e5fc7c4c20d27964b8855d429ffb69ddd44594d1e039aab3c6d9c7" exitCode=1
Mar 19 09:33:07.458605 master-0 kubenswrapper[13205]: I0319
09:33:07.458581 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"d2773f59c5e5fc7c4c20d27964b8855d429ffb69ddd44594d1e039aab3c6d9c7"} Mar 19 09:33:11.129792 master-0 kubenswrapper[13205]: E0319 09:33:11.129702 13205 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:33:11.849732 master-0 kubenswrapper[13205]: I0319 09:33:11.849598 13205 scope.go:117] "RemoveContainer" containerID="6ae902174b9d621c212682b70ad807c14a4e92599470fc3aaedbdc3e8e6191c4" Mar 19 09:33:12.495622 master-0 kubenswrapper[13205]: I0319 09:33:12.495574 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/3.log" Mar 19 09:33:12.496244 master-0 kubenswrapper[13205]: I0319 09:33:12.495646 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-v9s9c" event={"ID":"dc65ec1f-b8fb-40d6-ac39-46b255a33221","Type":"ContainerStarted","Data":"cbbb2e85ef69c59ffe800a769aa231445d2f97eccc4a6414deeb2c05d99fe70b"} Mar 19 09:33:13.517892 master-0 kubenswrapper[13205]: I0319 09:33:13.517849 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/2.log" Mar 19 09:33:13.518393 master-0 kubenswrapper[13205]: I0319 09:33:13.518351 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/1.log" Mar 19 09:33:13.519148 master-0 kubenswrapper[13205]: I0319 09:33:13.519116 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log" Mar 19 09:33:13.521361 master-0 kubenswrapper[13205]: I0319 09:33:13.521328 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log" Mar 19 09:33:13.521431 master-0 kubenswrapper[13205]: I0319 09:33:13.521377 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="111f6b2ee66d02abafb1557877f87d32e2a7993224942e1860bba961d2423e46" exitCode=255 Mar 19 09:33:13.521431 master-0 kubenswrapper[13205]: I0319 09:33:13.521406 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerDied","Data":"111f6b2ee66d02abafb1557877f87d32e2a7993224942e1860bba961d2423e46"} Mar 19 09:33:13.521493 master-0 kubenswrapper[13205]: I0319 09:33:13.521435 13205 scope.go:117] "RemoveContainer" containerID="bf923d539ddfaa5eb793064e80c576ddfb576443c11bed486cd618e8813816c5" Mar 19 09:33:13.594108 master-0 kubenswrapper[13205]: E0319 09:33:13.594065 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:33:14.535136 master-0 kubenswrapper[13205]: I0319 09:33:14.535033 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/2.log" Mar 19 09:33:14.536749 master-0 kubenswrapper[13205]: I0319 09:33:14.536689 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log" Mar 19 09:33:14.538617 master-0 kubenswrapper[13205]: I0319 09:33:14.538512 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log" Mar 19 09:33:14.538778 master-0 kubenswrapper[13205]: I0319 09:33:14.538672 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"b9c3e8b758bdc9a75844b0278cf27a3810e791645794ec44f1bef75175922fcf"} Mar 19 09:33:14.539610 master-0 kubenswrapper[13205]: I0319 09:33:14.539573 13205 scope.go:117] "RemoveContainer" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8" Mar 19 09:33:14.539794 master-0 kubenswrapper[13205]: I0319 09:33:14.539667 13205 scope.go:117] "RemoveContainer" containerID="d2773f59c5e5fc7c4c20d27964b8855d429ffb69ddd44594d1e039aab3c6d9c7" Mar 19 09:33:14.660413 master-0 kubenswrapper[13205]: I0319 09:33:14.660340 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:14.865234 master-0 kubenswrapper[13205]: E0319 09:33:14.864780 13205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(a474cbd3d0d9d7ed4d0ff461a5e5fe1a)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" Mar 19 09:33:15.557781 master-0 kubenswrapper[13205]: I0319 09:33:15.557689 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/2.log" Mar 19 09:33:15.558727 master-0 kubenswrapper[13205]: I0319 09:33:15.558556 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log" Mar 19 09:33:15.559879 master-0 kubenswrapper[13205]: I0319 09:33:15.559816 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log" Mar 19 09:33:15.560044 master-0 kubenswrapper[13205]: I0319 09:33:15.559927 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"826550ebd0b1d6be98355aa6c853b794d4484edfacd28e4eb43f5eadc79826f2"} Mar 19 09:33:15.560728 master-0 kubenswrapper[13205]: I0319 09:33:15.560677 13205 scope.go:117] "RemoveContainer" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8" Mar 19 09:33:16.572994 master-0 kubenswrapper[13205]: I0319 09:33:16.572907 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/2.log" Mar 
19 09:33:16.573888 master-0 kubenswrapper[13205]: I0319 09:33:16.573637 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log" Mar 19 09:33:16.574744 master-0 kubenswrapper[13205]: I0319 09:33:16.574697 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log" Mar 19 09:33:16.574897 master-0 kubenswrapper[13205]: I0319 09:33:16.574759 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a474cbd3d0d9d7ed4d0ff461a5e5fe1a","Type":"ContainerStarted","Data":"6586d1ef1d8b389dab4a4ad49608dcd75cf745858d90f13a98e0648bcd092731"} Mar 19 09:33:17.144428 master-0 kubenswrapper[13205]: I0319 09:33:17.144391 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:33:17.144793 master-0 kubenswrapper[13205]: I0319 09:33:17.144766 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:33:21.255304 master-0 kubenswrapper[13205]: I0319 09:33:21.255210 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:33:21.256369 master-0 kubenswrapper[13205]: I0319 09:33:21.255575 13205 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://e7ac13cba0a41afefd1f1913bc7aba4a187c6d99752100ec1e36b10b44ac9c6a" gracePeriod=30 Mar 19 09:33:21.256369 master-0 kubenswrapper[13205]: I0319 09:33:21.255734 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://826550ebd0b1d6be98355aa6c853b794d4484edfacd28e4eb43f5eadc79826f2" gracePeriod=30 Mar 19 09:33:21.256369 master-0 kubenswrapper[13205]: I0319 09:33:21.255734 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" containerID="cri-o://b9c3e8b758bdc9a75844b0278cf27a3810e791645794ec44f1bef75175922fcf" gracePeriod=30 Mar 19 09:33:21.256369 master-0 kubenswrapper[13205]: I0319 09:33:21.255778 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" containerID="cri-o://6586d1ef1d8b389dab4a4ad49608dcd75cf745858d90f13a98e0648bcd092731" gracePeriod=30 Mar 19 09:33:21.257781 master-0 kubenswrapper[13205]: I0319 09:33:21.257655 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.257971 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: 
I0319 09:33:21.257993 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.258010 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: I0319 09:33:21.258018 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.258029 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-cert-syncer" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: I0319 09:33:21.258040 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-cert-syncer" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.258053 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: I0319 09:33:21.258062 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.258076 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: I0319 09:33:21.258084 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.258104 13205 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: I0319 09:33:21.258113 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.258122 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: I0319 09:33:21.258130 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.258113 master-0 kubenswrapper[13205]: E0319 09:33:21.258157 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-recovery-controller" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258166 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-recovery-controller" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258321 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258338 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258356 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 
09:33:21.258371 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258387 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258400 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258416 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258428 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258436 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-recovery-controller" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258445 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-cert-syncer" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258457 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-cert-syncer" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258476 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 
09:33:21.259668 master-0 kubenswrapper[13205]: E0319 09:33:21.258668 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258683 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: E0319 09:33:21.258696 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-cert-syncer" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258706 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager-cert-syncer" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: E0319 09:33:21.258724 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258734 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="kube-controller-manager" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: E0319 09:33:21.258786 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.259668 master-0 kubenswrapper[13205]: I0319 09:33:21.258799 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerName="cluster-policy-controller" Mar 19 09:33:21.405993 master-0 kubenswrapper[13205]: I0319 09:33:21.405952 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"01ef8262f7214653fecba11f5aa7ce13\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:21.406282 master-0 kubenswrapper[13205]: I0319 09:33:21.406121 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"01ef8262f7214653fecba11f5aa7ce13\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:21.450676 master-0 kubenswrapper[13205]: I0319 09:33:21.450623 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/1.log" Mar 19 09:33:21.452732 master-0 kubenswrapper[13205]: I0319 09:33:21.452701 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/2.log" Mar 19 09:33:21.453518 master-0 kubenswrapper[13205]: I0319 09:33:21.453473 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log" Mar 19 09:33:21.464797 master-0 kubenswrapper[13205]: I0319 09:33:21.462030 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log" Mar 19 09:33:21.464797 master-0 kubenswrapper[13205]: I0319 09:33:21.462193 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:21.469066 master-0 kubenswrapper[13205]: I0319 09:33:21.468996 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" podUID="01ef8262f7214653fecba11f5aa7ce13" Mar 19 09:33:21.507175 master-0 kubenswrapper[13205]: I0319 09:33:21.507062 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"01ef8262f7214653fecba11f5aa7ce13\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:21.507346 master-0 kubenswrapper[13205]: I0319 09:33:21.507149 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"01ef8262f7214653fecba11f5aa7ce13\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:21.507346 master-0 kubenswrapper[13205]: I0319 09:33:21.507270 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"01ef8262f7214653fecba11f5aa7ce13\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:21.507346 master-0 kubenswrapper[13205]: I0319 09:33:21.507343 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: 
\"01ef8262f7214653fecba11f5aa7ce13\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:21.608076 master-0 kubenswrapper[13205]: I0319 09:33:21.608020 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") pod \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " Mar 19 09:33:21.608278 master-0 kubenswrapper[13205]: I0319 09:33:21.608179 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") pod \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\" (UID: \"a474cbd3d0d9d7ed4d0ff461a5e5fe1a\") " Mar 19 09:33:21.608364 master-0 kubenswrapper[13205]: I0319 09:33:21.608323 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a474cbd3d0d9d7ed4d0ff461a5e5fe1a" (UID: "a474cbd3d0d9d7ed4d0ff461a5e5fe1a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:33:21.608495 master-0 kubenswrapper[13205]: I0319 09:33:21.608462 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "a474cbd3d0d9d7ed4d0ff461a5e5fe1a" (UID: "a474cbd3d0d9d7ed4d0ff461a5e5fe1a"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:33:21.608580 master-0 kubenswrapper[13205]: I0319 09:33:21.608548 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:21.615174 master-0 kubenswrapper[13205]: I0319 09:33:21.615125 13205 generic.go:334] "Generic (PLEG): container finished" podID="e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" containerID="93aa177a299a996e489a2b97d047f9a2e5110bba02b78838ebfd5ecc32f7b998" exitCode=0 Mar 19 09:33:21.615347 master-0 kubenswrapper[13205]: I0319 09:33:21.615200 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4","Type":"ContainerDied","Data":"93aa177a299a996e489a2b97d047f9a2e5110bba02b78838ebfd5ecc32f7b998"} Mar 19 09:33:21.618032 master-0 kubenswrapper[13205]: I0319 09:33:21.617982 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/1.log" Mar 19 09:33:21.618831 master-0 kubenswrapper[13205]: I0319 09:33:21.618804 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/cluster-policy-controller/2.log" Mar 19 09:33:21.619567 master-0 kubenswrapper[13205]: I0319 09:33:21.619511 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager/3.log" Mar 19 09:33:21.620656 master-0 kubenswrapper[13205]: I0319 09:33:21.620623 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/0.log"
Mar 19 09:33:21.620724 master-0 kubenswrapper[13205]: I0319 09:33:21.620671 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="6586d1ef1d8b389dab4a4ad49608dcd75cf745858d90f13a98e0648bcd092731" exitCode=0
Mar 19 09:33:21.620724 master-0 kubenswrapper[13205]: I0319 09:33:21.620690 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="826550ebd0b1d6be98355aa6c853b794d4484edfacd28e4eb43f5eadc79826f2" exitCode=2
Mar 19 09:33:21.620724 master-0 kubenswrapper[13205]: I0319 09:33:21.620701 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="b9c3e8b758bdc9a75844b0278cf27a3810e791645794ec44f1bef75175922fcf" exitCode=0
Mar 19 09:33:21.620828 master-0 kubenswrapper[13205]: I0319 09:33:21.620724 13205 generic.go:334] "Generic (PLEG): container finished" podID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" containerID="e7ac13cba0a41afefd1f1913bc7aba4a187c6d99752100ec1e36b10b44ac9c6a" exitCode=0
Mar 19 09:33:21.620828 master-0 kubenswrapper[13205]: I0319 09:33:21.620762 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1132a72c9136c6d33d6382355fa3991b260d8a3776fc503599fe4ecedb8985b2"
Mar 19 09:33:21.620828 master-0 kubenswrapper[13205]: I0319 09:33:21.620780 13205 scope.go:117] "RemoveContainer" containerID="376660e439ae8f2c02ca4c362c919c890d0ad33fc46053d8caa8f4c5abc8c5d8"
Mar 19 09:33:21.620913 master-0 kubenswrapper[13205]: I0319 09:33:21.620897 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:21.637271 master-0 kubenswrapper[13205]: I0319 09:33:21.637198 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" podUID="01ef8262f7214653fecba11f5aa7ce13"
Mar 19 09:33:21.640623 master-0 kubenswrapper[13205]: I0319 09:33:21.638769 13205 scope.go:117] "RemoveContainer" containerID="d2773f59c5e5fc7c4c20d27964b8855d429ffb69ddd44594d1e039aab3c6d9c7"
Mar 19 09:33:21.644944 master-0 kubenswrapper[13205]: I0319 09:33:21.644876 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" podUID="01ef8262f7214653fecba11f5aa7ce13"
Mar 19 09:33:21.655785 master-0 kubenswrapper[13205]: I0319 09:33:21.655744 13205 scope.go:117] "RemoveContainer" containerID="111f6b2ee66d02abafb1557877f87d32e2a7993224942e1860bba961d2423e46"
Mar 19 09:33:21.710257 master-0 kubenswrapper[13205]: I0319 09:33:21.710206 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a474cbd3d0d9d7ed4d0ff461a5e5fe1a-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:33:22.631195 master-0 kubenswrapper[13205]: I0319 09:33:22.631134 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a474cbd3d0d9d7ed4d0ff461a5e5fe1a/kube-controller-manager-cert-syncer/1.log"
Mar 19 09:33:22.861516 master-0 kubenswrapper[13205]: I0319 09:33:22.861471 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a474cbd3d0d9d7ed4d0ff461a5e5fe1a" path="/var/lib/kubelet/pods/a474cbd3d0d9d7ed4d0ff461a5e5fe1a/volumes"
Mar 19 09:33:22.969568 master-0 kubenswrapper[13205]: I0319 09:33:22.969511 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:33:23.131355 master-0 kubenswrapper[13205]: I0319 09:33:23.131269 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kubelet-dir\") pod \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") "
Mar 19 09:33:23.131652 master-0 kubenswrapper[13205]: I0319 09:33:23.131396 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kube-api-access\") pod \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") "
Mar 19 09:33:23.131652 master-0 kubenswrapper[13205]: I0319 09:33:23.131477 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-var-lock\") pod \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\" (UID: \"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4\") "
Mar 19 09:33:23.131652 master-0 kubenswrapper[13205]: I0319 09:33:23.131382 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" (UID: "e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:33:23.131871 master-0 kubenswrapper[13205]: I0319 09:33:23.131738 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-var-lock" (OuterVolumeSpecName: "var-lock") pod "e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" (UID: "e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:33:23.131978 master-0 kubenswrapper[13205]: I0319 09:33:23.131943 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:33:23.131978 master-0 kubenswrapper[13205]: I0319 09:33:23.131973 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:33:23.134158 master-0 kubenswrapper[13205]: I0319 09:33:23.134115 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" (UID: "e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:33:23.233613 master-0 kubenswrapper[13205]: I0319 09:33:23.233455 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:33:23.640623 master-0 kubenswrapper[13205]: I0319 09:33:23.640581 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4","Type":"ContainerDied","Data":"d5662f02e0c233de962a55aa5a4b5f0bff1496b04c43b153ee06f94e129b37f3"}
Mar 19 09:33:23.641305 master-0 kubenswrapper[13205]: I0319 09:33:23.641277 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5662f02e0c233de962a55aa5a4b5f0bff1496b04c43b153ee06f94e129b37f3"
Mar 19 09:33:23.641581 master-0 kubenswrapper[13205]: I0319 09:33:23.641544 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:33:27.144722 master-0 kubenswrapper[13205]: I0319 09:33:27.144667 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:33:27.145596 master-0 kubenswrapper[13205]: I0319 09:33:27.145546 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:33:27.145835 master-0 kubenswrapper[13205]: I0319 09:33:27.145810 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79f67cdc89-bx72w"
Mar 19 09:33:27.146697 master-0 kubenswrapper[13205]: I0319 09:33:27.146668 13205 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"3126c808e06276b72f50b1bcec104cc8290fd8a1252c1d1a5a621abc3da492cd"} pod="openshift-console/console-79f67cdc89-bx72w" containerMessage="Container console failed startup probe, will be restarted"
Mar 19 09:33:27.200321 master-0 kubenswrapper[13205]: I0319 09:33:27.200253 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 19 09:33:27.200618 master-0 kubenswrapper[13205]: E0319 09:33:27.200590 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" containerName="installer"
Mar 19 09:33:27.200618 master-0 kubenswrapper[13205]: I0319 09:33:27.200611 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" containerName="installer"
Mar 19 09:33:27.200796 master-0 kubenswrapper[13205]: I0319 09:33:27.200769 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4" containerName="installer"
Mar 19 09:33:27.201348 master-0 kubenswrapper[13205]: I0319 09:33:27.201308 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.203265 master-0 kubenswrapper[13205]: I0319 09:33:27.203203 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 19 09:33:27.204933 master-0 kubenswrapper[13205]: I0319 09:33:27.204874 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-xzz4b"
Mar 19 09:33:27.216784 master-0 kubenswrapper[13205]: I0319 09:33:27.216705 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 19 09:33:27.301030 master-0 kubenswrapper[13205]: I0319 09:33:27.300968 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.301361 master-0 kubenswrapper[13205]: I0319 09:33:27.301317 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-var-lock\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.301962 master-0 kubenswrapper[13205]: I0319 09:33:27.301913 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kube-api-access\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.403550 master-0 kubenswrapper[13205]: I0319 09:33:27.403391 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kube-api-access\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.403550 master-0 kubenswrapper[13205]: I0319 09:33:27.403482 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.403760 master-0 kubenswrapper[13205]: I0319 09:33:27.403658 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-var-lock\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.403844 master-0 kubenswrapper[13205]: I0319 09:33:27.403814 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-var-lock\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.404006 master-0 kubenswrapper[13205]: I0319 09:33:27.403970 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.424036 master-0 kubenswrapper[13205]: I0319 09:33:27.423971 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kube-api-access\") pod \"installer-5-master-0\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:27.518132 master-0 kubenswrapper[13205]: I0319 09:33:27.518081 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 09:33:28.002275 master-0 kubenswrapper[13205]: I0319 09:33:28.002202 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 19 09:33:28.681937 master-0 kubenswrapper[13205]: I0319 09:33:28.681779 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"621a61b9-69f0-4bbe-ae33-56a4473c72ee","Type":"ContainerStarted","Data":"e7bf368fcae180a0ba7541554d44ad054ecd176c57a65be6c28b9187d83dd5f6"}
Mar 19 09:33:28.682998 master-0 kubenswrapper[13205]: I0319 09:33:28.682947 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"621a61b9-69f0-4bbe-ae33-56a4473c72ee","Type":"ContainerStarted","Data":"4afcbd4532566468202978264189558122552ec1de0de3ab5ebc6b167bd4e785"}
Mar 19 09:33:28.707228 master-0 kubenswrapper[13205]: I0319 09:33:28.707105 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=1.7070812549999999 podStartE2EDuration="1.707081255s" podCreationTimestamp="2026-03-19 09:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:33:28.705606328 +0000 UTC m=+594.037913266" watchObservedRunningTime="2026-03-19 09:33:28.707081255 +0000 UTC m=+594.039388163"
Mar 19 09:33:31.699059 master-0 kubenswrapper[13205]: I0319 09:33:31.698998 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 19 09:33:31.700007 master-0 kubenswrapper[13205]: I0319 09:33:31.699981 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.703541 master-0 kubenswrapper[13205]: I0319 09:33:31.703502 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 19 09:33:31.703864 master-0 kubenswrapper[13205]: I0319 09:33:31.703843 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nm2j7"
Mar 19 09:33:31.714119 master-0 kubenswrapper[13205]: I0319 09:33:31.713437 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 19 09:33:31.791392 master-0 kubenswrapper[13205]: I0319 09:33:31.791307 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kube-api-access\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.791392 master-0 kubenswrapper[13205]: I0319 09:33:31.791400 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-var-lock\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.792235 master-0 kubenswrapper[13205]: I0319 09:33:31.791425 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.892362 master-0 kubenswrapper[13205]: I0319 09:33:31.892307 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kube-api-access\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.892646 master-0 kubenswrapper[13205]: I0319 09:33:31.892392 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-var-lock\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.892646 master-0 kubenswrapper[13205]: I0319 09:33:31.892565 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.892753 master-0 kubenswrapper[13205]: I0319 09:33:31.892649 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.892912 master-0 kubenswrapper[13205]: I0319 09:33:31.892861 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-var-lock\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:31.914595 master-0 kubenswrapper[13205]: I0319 09:33:31.913793 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kube-api-access\") pod \"installer-3-master-0\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:32.026120 master-0 kubenswrapper[13205]: I0319 09:33:32.026038 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 09:33:32.474920 master-0 kubenswrapper[13205]: I0319 09:33:32.473732 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 19 09:33:32.747508 master-0 kubenswrapper[13205]: I0319 09:33:32.747340 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"f7b5036d-9738-4e7e-a11f-ed64194ea30f","Type":"ContainerStarted","Data":"f9221f0cda6995023043a3926b6b04edb9f0474b37a79ab4819d0f125bcb4d0e"}
Mar 19 09:33:32.848416 master-0 kubenswrapper[13205]: I0319 09:33:32.848367 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:32.872882 master-0 kubenswrapper[13205]: I0319 09:33:32.872827 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="64f3b074-8e48-41b2-b30b-c0592d3133ff"
Mar 19 09:33:32.872882 master-0 kubenswrapper[13205]: I0319 09:33:32.872870 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="64f3b074-8e48-41b2-b30b-c0592d3133ff"
Mar 19 09:33:32.882724 master-0 kubenswrapper[13205]: I0319 09:33:32.882680 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:32.885226 master-0 kubenswrapper[13205]: I0319 09:33:32.885190 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:33:32.889630 master-0 kubenswrapper[13205]: I0319 09:33:32.889608 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:32.889958 master-0 kubenswrapper[13205]: I0319 09:33:32.889901 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:33:32.894035 master-0 kubenswrapper[13205]: I0319 09:33:32.893977 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:33:33.758577 master-0 kubenswrapper[13205]: I0319 09:33:33.758235 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"f7b5036d-9738-4e7e-a11f-ed64194ea30f","Type":"ContainerStarted","Data":"ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8"}
Mar 19 09:33:33.765760 master-0 kubenswrapper[13205]: I0319 09:33:33.765713 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"01ef8262f7214653fecba11f5aa7ce13","Type":"ContainerStarted","Data":"f46b0b23ccdc4101d15fea4308a57f8af72710fa5156b459dd9c1fc3d0424ef4"}
Mar 19 09:33:33.765760 master-0 kubenswrapper[13205]: I0319 09:33:33.765764 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"01ef8262f7214653fecba11f5aa7ce13","Type":"ContainerStarted","Data":"4b8296c8aab85c007fe985852836d80847020fe70c583a261ac67856bf44c2bf"}
Mar 19 09:33:33.765992 master-0 kubenswrapper[13205]: I0319 09:33:33.765779 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"01ef8262f7214653fecba11f5aa7ce13","Type":"ContainerStarted","Data":"234932c6aa4854708a10bfd6ff5c0b2a32a6ce550c7885888734f2d1075fb3a5"}
Mar 19 09:33:33.765992 master-0 kubenswrapper[13205]: I0319 09:33:33.765796 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"01ef8262f7214653fecba11f5aa7ce13","Type":"ContainerStarted","Data":"8da454a69acb933dd50995cff47f3884f390cb29b4385920af6216097e647256"}
Mar 19 09:33:33.808999 master-0 kubenswrapper[13205]: I0319 09:33:33.808889 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.8088692269999997 podStartE2EDuration="2.808869227s" podCreationTimestamp="2026-03-19 09:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:33:33.804014037 +0000 UTC m=+599.136320925" watchObservedRunningTime="2026-03-19 09:33:33.808869227 +0000 UTC m=+599.141176115"
Mar 19 09:33:34.791811 master-0 kubenswrapper[13205]: I0319 09:33:34.791628 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"01ef8262f7214653fecba11f5aa7ce13","Type":"ContainerStarted","Data":"e11c7067a7cc9283dccf50eb10db382afb4e377743f71c297da2c1fc383ce771"}
Mar 19 09:33:34.815315 master-0 kubenswrapper[13205]: I0319 09:33:34.815156 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.815141515 podStartE2EDuration="2.815141515s" podCreationTimestamp="2026-03-19 09:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:33:34.813419502 +0000 UTC m=+600.145726390" watchObservedRunningTime="2026-03-19 09:33:34.815141515 +0000 UTC m=+600.147448403"
Mar 19 09:33:35.210727 master-0 kubenswrapper[13205]: E0319 09:33:35.210593 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9
Mar 19 09:33:35.770253 master-0 kubenswrapper[13205]: E0319 09:33:35.770186 13205 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-79f67cdc89-bx72w" message=""
Mar 19 09:33:35.770253 master-0 kubenswrapper[13205]: E0319 09:33:35.770230 13205 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' exited with 137: " pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" containerID="cri-o://3126c808e06276b72f50b1bcec104cc8290fd8a1252c1d1a5a621abc3da492cd"
Mar 19 09:33:35.770458 master-0 kubenswrapper[13205]: I0319 09:33:35.770270 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" containerID="cri-o://3126c808e06276b72f50b1bcec104cc8290fd8a1252c1d1a5a621abc3da492cd" gracePeriod=32
Mar 19 09:33:35.798995 master-0 kubenswrapper[13205]: I0319 09:33:35.798953 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79f67cdc89-bx72w_5a8e5bd7-de13-4773-8a38-5edf4fda23fd/console/0.log"
Mar 19 09:33:35.799440 master-0 kubenswrapper[13205]: I0319 09:33:35.799003 13205 generic.go:334] "Generic (PLEG): container finished" podID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerID="3126c808e06276b72f50b1bcec104cc8290fd8a1252c1d1a5a621abc3da492cd" exitCode=255
Mar 19 09:33:35.799440 master-0 kubenswrapper[13205]: I0319 09:33:35.799044 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79f67cdc89-bx72w" event={"ID":"5a8e5bd7-de13-4773-8a38-5edf4fda23fd","Type":"ContainerDied","Data":"3126c808e06276b72f50b1bcec104cc8290fd8a1252c1d1a5a621abc3da492cd"}
Mar 19 09:33:36.808353 master-0 kubenswrapper[13205]: I0319 09:33:36.808267 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79f67cdc89-bx72w_5a8e5bd7-de13-4773-8a38-5edf4fda23fd/console/0.log"
Mar 19 09:33:36.808353 master-0 kubenswrapper[13205]: I0319 09:33:36.808355 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79f67cdc89-bx72w" event={"ID":"5a8e5bd7-de13-4773-8a38-5edf4fda23fd","Type":"ContainerStarted","Data":"a310c2e6c4b3f15606e08a140ed88a386ad094d2c2a0c14e05f5a9c148af6b08"}
Mar 19 09:33:37.144662 master-0 kubenswrapper[13205]: I0319 09:33:37.144398 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79f67cdc89-bx72w"
Mar 19 09:33:37.144957 master-0 kubenswrapper[13205]: I0319 09:33:37.144854 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79f67cdc89-bx72w"
Mar 19 09:33:37.145242 master-0 kubenswrapper[13205]: I0319 09:33:37.145193 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:33:37.145474 master-0 kubenswrapper[13205]: I0319 09:33:37.145431 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:33:39.661607 master-0 kubenswrapper[13205]: I0319 09:33:39.661513 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:33:39.662229 master-0 kubenswrapper[13205]: I0319 09:33:39.661823 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="f84f3f5a-d1da-4e9e-bfae-e6264c751372" containerName="installer" containerID="cri-o://3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8" gracePeriod=30
Mar 19 09:33:40.214882 master-0 kubenswrapper[13205]: I0319 09:33:40.214762 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_f84f3f5a-d1da-4e9e-bfae-e6264c751372/installer/0.log"
Mar 19 09:33:40.214882 master-0 kubenswrapper[13205]: I0319 09:33:40.214839 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:40.411161 master-0 kubenswrapper[13205]: I0319 09:33:40.411091 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kubelet-dir\") pod \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") "
Mar 19 09:33:40.411357 master-0 kubenswrapper[13205]: I0319 09:33:40.411180 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f84f3f5a-d1da-4e9e-bfae-e6264c751372" (UID: "f84f3f5a-d1da-4e9e-bfae-e6264c751372"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:33:40.411357 master-0 kubenswrapper[13205]: I0319 09:33:40.411276 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kube-api-access\") pod \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") "
Mar 19 09:33:40.411357 master-0 kubenswrapper[13205]: I0319 09:33:40.411317 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-var-lock\") pod \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\" (UID: \"f84f3f5a-d1da-4e9e-bfae-e6264c751372\") "
Mar 19 09:33:40.411462 master-0 kubenswrapper[13205]: I0319 09:33:40.411449 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-var-lock" (OuterVolumeSpecName: "var-lock") pod "f84f3f5a-d1da-4e9e-bfae-e6264c751372" (UID: "f84f3f5a-d1da-4e9e-bfae-e6264c751372"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:33:40.411811 master-0 kubenswrapper[13205]: I0319 09:33:40.411774 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:33:40.411854 master-0 kubenswrapper[13205]: I0319 09:33:40.411812 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f84f3f5a-d1da-4e9e-bfae-e6264c751372-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:33:40.413941 master-0 kubenswrapper[13205]: I0319 09:33:40.413900 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f84f3f5a-d1da-4e9e-bfae-e6264c751372" (UID: "f84f3f5a-d1da-4e9e-bfae-e6264c751372"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:33:40.512291 master-0 kubenswrapper[13205]: I0319 09:33:40.512189 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f84f3f5a-d1da-4e9e-bfae-e6264c751372-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:33:40.841477 master-0 kubenswrapper[13205]: I0319 09:33:40.841360 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_f84f3f5a-d1da-4e9e-bfae-e6264c751372/installer/0.log"
Mar 19 09:33:40.841477 master-0 kubenswrapper[13205]: I0319 09:33:40.841423 13205 generic.go:334] "Generic (PLEG): container finished" podID="f84f3f5a-d1da-4e9e-bfae-e6264c751372" containerID="3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8" exitCode=1
Mar 19 09:33:40.841477 master-0 kubenswrapper[13205]: I0319 09:33:40.841457 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f84f3f5a-d1da-4e9e-bfae-e6264c751372","Type":"ContainerDied","Data":"3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8"}
Mar 19 09:33:40.841477 master-0 kubenswrapper[13205]: I0319 09:33:40.841485 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f84f3f5a-d1da-4e9e-bfae-e6264c751372","Type":"ContainerDied","Data":"f31a0b1f8eafa06f8d2cd14f7ac5f32f093e33ea1d360285debc9fd4a89b85a0"}
Mar 19 09:33:40.842684 master-0 kubenswrapper[13205]: I0319 09:33:40.841505 13205 scope.go:117] "RemoveContainer" containerID="3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8"
Mar 19 09:33:40.842684 master-0 kubenswrapper[13205]: I0319 09:33:40.841741 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:33:40.883092 master-0 kubenswrapper[13205]: I0319 09:33:40.883051 13205 scope.go:117] "RemoveContainer" containerID="3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8"
Mar 19 09:33:40.883682 master-0 kubenswrapper[13205]: E0319 09:33:40.883643 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8\": container with ID starting with 3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8 not found: ID does not exist" containerID="3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8"
Mar 19 09:33:40.883758 master-0 kubenswrapper[13205]: I0319 09:33:40.883697 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8"} err="failed to get container status \"3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8\": rpc error: code = NotFound desc = could not find container \"3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8\": container with ID starting with 3b7e2b82a800961033580be965b7421cc6148ade3e66f3f71deac229fafd76d8 not found: ID does not exist"
Mar 19 09:33:40.884432 master-0 kubenswrapper[13205]: I0319 09:33:40.884393 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:33:40.895896 master-0 kubenswrapper[13205]: I0319 09:33:40.895600 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:33:42.857239 master-0 kubenswrapper[13205]: I0319 09:33:42.857097 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84f3f5a-d1da-4e9e-bfae-e6264c751372" path="/var/lib/kubelet/pods/f84f3f5a-d1da-4e9e-bfae-e6264c751372/volumes"
Mar 19 09:33:42.881307 master-0 kubenswrapper[13205]: I0319 09:33:42.881229 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 19 09:33:42.881607 master-0 kubenswrapper[13205]: I0319 09:33:42.881518 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-3-master-0" podUID="f7b5036d-9738-4e7e-a11f-ed64194ea30f" containerName="installer" containerID="cri-o://ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8" gracePeriod=30
Mar 19 09:33:42.889829 master-0 kubenswrapper[13205]: I0319 09:33:42.889780 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:42.889829 master-0 kubenswrapper[13205]: I0319 09:33:42.889827 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:42.889829 master-0 kubenswrapper[13205]: I0319 09:33:42.889837 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:42.890028 master-0 kubenswrapper[13205]: I0319 09:33:42.889847 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:42.893920 master-0 kubenswrapper[13205]: I0319 09:33:42.893858 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:42.894299 master-0 kubenswrapper[13205]: I0319 09:33:42.894257 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:43.680561 master-0 kubenswrapper[13205]: I0319 09:33:43.680443 13205 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:33:43.680955 master-0 kubenswrapper[13205]: E0319 09:33:43.680911 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84f3f5a-d1da-4e9e-bfae-e6264c751372" containerName="installer" Mar 19 09:33:43.680955 master-0 kubenswrapper[13205]: I0319 09:33:43.680944 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84f3f5a-d1da-4e9e-bfae-e6264c751372" containerName="installer" Mar 19 09:33:43.681196 master-0 kubenswrapper[13205]: I0319 09:33:43.681173 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84f3f5a-d1da-4e9e-bfae-e6264c751372" containerName="installer" Mar 19 09:33:43.681910 master-0 kubenswrapper[13205]: I0319 09:33:43.681852 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.685079 master-0 kubenswrapper[13205]: I0319 09:33:43.685019 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 09:33:43.686285 master-0 kubenswrapper[13205]: I0319 09:33:43.686231 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-qzwhq" Mar 19 09:33:43.701049 master-0 kubenswrapper[13205]: I0319 09:33:43.700968 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:33:43.858236 master-0 kubenswrapper[13205]: I0319 09:33:43.858164 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-var-lock\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.858936 master-0 kubenswrapper[13205]: I0319 09:33:43.858281 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85810df1-4989-449a-8da0-192c8720d5f4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.858936 master-0 kubenswrapper[13205]: I0319 09:33:43.858851 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.877605 master-0 kubenswrapper[13205]: I0319 09:33:43.877516 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:43.879826 master-0 kubenswrapper[13205]: I0319 09:33:43.879780 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:33:43.960765 master-0 kubenswrapper[13205]: I0319 09:33:43.960645 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.960932 master-0 kubenswrapper[13205]: I0319 09:33:43.960765 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-var-lock\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 
09:33:43.960932 master-0 kubenswrapper[13205]: I0319 09:33:43.960766 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.961003 master-0 kubenswrapper[13205]: I0319 09:33:43.960931 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-var-lock\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.961003 master-0 kubenswrapper[13205]: I0319 09:33:43.960977 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85810df1-4989-449a-8da0-192c8720d5f4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:43.980928 master-0 kubenswrapper[13205]: I0319 09:33:43.980899 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85810df1-4989-449a-8da0-192c8720d5f4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:44.010718 master-0 kubenswrapper[13205]: I0319 09:33:44.010671 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:33:44.449348 master-0 kubenswrapper[13205]: I0319 09:33:44.449260 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:33:44.451821 master-0 kubenswrapper[13205]: W0319 09:33:44.451763 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod85810df1_4989_449a_8da0_192c8720d5f4.slice/crio-aa07c08eded51738a9a43b5549e93c3b74348793377d3b5c9d05584e1dc87795 WatchSource:0}: Error finding container aa07c08eded51738a9a43b5549e93c3b74348793377d3b5c9d05584e1dc87795: Status 404 returned error can't find the container with id aa07c08eded51738a9a43b5549e93c3b74348793377d3b5c9d05584e1dc87795 Mar 19 09:33:44.894754 master-0 kubenswrapper[13205]: I0319 09:33:44.894692 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"85810df1-4989-449a-8da0-192c8720d5f4","Type":"ContainerStarted","Data":"c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8"} Mar 19 09:33:44.894754 master-0 kubenswrapper[13205]: I0319 09:33:44.894750 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"85810df1-4989-449a-8da0-192c8720d5f4","Type":"ContainerStarted","Data":"aa07c08eded51738a9a43b5549e93c3b74348793377d3b5c9d05584e1dc87795"} Mar 19 09:33:44.915215 master-0 kubenswrapper[13205]: I0319 09:33:44.915151 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=1.91513133 podStartE2EDuration="1.91513133s" podCreationTimestamp="2026-03-19 09:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:33:44.914923135 +0000 UTC m=+610.247230043" watchObservedRunningTime="2026-03-19 
09:33:44.91513133 +0000 UTC m=+610.247438218" Mar 19 09:33:45.081030 master-0 kubenswrapper[13205]: I0319 09:33:45.080964 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 09:33:45.082068 master-0 kubenswrapper[13205]: I0319 09:33:45.082040 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.093365 master-0 kubenswrapper[13205]: I0319 09:33:45.093308 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 09:33:45.179647 master-0 kubenswrapper[13205]: I0319 09:33:45.179600 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-var-lock\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.179933 master-0 kubenswrapper[13205]: I0319 09:33:45.179913 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kube-api-access\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.180143 master-0 kubenswrapper[13205]: I0319 09:33:45.180127 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.281479 master-0 kubenswrapper[13205]: I0319 09:33:45.281376 
13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.281773 master-0 kubenswrapper[13205]: I0319 09:33:45.281708 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-var-lock\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.281972 master-0 kubenswrapper[13205]: I0319 09:33:45.281934 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.282135 master-0 kubenswrapper[13205]: I0319 09:33:45.281945 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kube-api-access\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.282375 master-0 kubenswrapper[13205]: I0319 09:33:45.282345 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-var-lock\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.300642 master-0 kubenswrapper[13205]: I0319 
09:33:45.300591 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kube-api-access\") pod \"installer-4-master-0\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.419586 master-0 kubenswrapper[13205]: I0319 09:33:45.419393 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:33:45.886562 master-0 kubenswrapper[13205]: I0319 09:33:45.886469 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 09:33:45.902020 master-0 kubenswrapper[13205]: I0319 09:33:45.901959 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"3cb82068-9d79-4917-88fd-07cd7a9adbb4","Type":"ContainerStarted","Data":"20b9c18b57eee9cebcb0a968ade99e915a2d56bf0568ccfa90cc604f27af0e06"} Mar 19 09:33:46.911683 master-0 kubenswrapper[13205]: I0319 09:33:46.911548 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"3cb82068-9d79-4917-88fd-07cd7a9adbb4","Type":"ContainerStarted","Data":"2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb"} Mar 19 09:33:47.020271 master-0 kubenswrapper[13205]: I0319 09:33:47.020182 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.020161271 podStartE2EDuration="2.020161271s" podCreationTimestamp="2026-03-19 09:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:33:47.019908795 +0000 UTC m=+612.352215793" watchObservedRunningTime="2026-03-19 
09:33:47.020161271 +0000 UTC m=+612.352468159" Mar 19 09:33:47.144609 master-0 kubenswrapper[13205]: I0319 09:33:47.144553 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:33:47.144833 master-0 kubenswrapper[13205]: I0319 09:33:47.144626 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:33:50.663123 master-0 kubenswrapper[13205]: I0319 09:33:50.663021 13205 scope.go:117] "RemoveContainer" containerID="e7ac13cba0a41afefd1f1913bc7aba4a187c6d99752100ec1e36b10b44ac9c6a" Mar 19 09:33:52.634081 master-0 kubenswrapper[13205]: I0319 09:33:52.626556 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:33:52.634081 master-0 kubenswrapper[13205]: I0319 09:33:52.627214 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy-metric" containerID="cri-o://b58230ea6728c09693b47e9b3c47f1cdfd5bb13858f7d22418d0a0bdedc1cc40" gracePeriod=120 Mar 19 09:33:52.634081 master-0 kubenswrapper[13205]: I0319 09:33:52.627724 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="prom-label-proxy" containerID="cri-o://0c056e4adcec340ff25f4bf0e81c1d601d8672e70e9ed3ad93cb6aaf58259ee8" gracePeriod=120 Mar 19 09:33:52.634081 master-0 kubenswrapper[13205]: I0319 09:33:52.628042 13205 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy-web" containerID="cri-o://3aa897760287be7bf5678ba84bdc3a14c8994b11d15cd84e87e2e74bc308cfec" gracePeriod=120 Mar 19 09:33:52.634081 master-0 kubenswrapper[13205]: I0319 09:33:52.628126 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy" containerID="cri-o://f01fafa23647d2c15c12e9ac89a35b57dc0ffe34180d8863d99d69493d129fe2" gracePeriod=120 Mar 19 09:33:52.634081 master-0 kubenswrapper[13205]: I0319 09:33:52.628205 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="config-reloader" containerID="cri-o://4e02fedc03344827c1ab9002c8d47589bccbb85c521ddda8cf2e2351989807a5" gracePeriod=120 Mar 19 09:33:52.634081 master-0 kubenswrapper[13205]: I0319 09:33:52.630311 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="alertmanager" containerID="cri-o://17884e5909f5f6cdc750bcb0af814ac2c5ddff0fda4e8d2db5915fe8b6602930" gracePeriod=120 Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995748 13205 generic.go:334] "Generic (PLEG): container finished" podID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerID="0c056e4adcec340ff25f4bf0e81c1d601d8672e70e9ed3ad93cb6aaf58259ee8" exitCode=0 Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995780 13205 generic.go:334] "Generic (PLEG): container finished" podID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerID="b58230ea6728c09693b47e9b3c47f1cdfd5bb13858f7d22418d0a0bdedc1cc40" exitCode=0 Mar 19 
09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995788 13205 generic.go:334] "Generic (PLEG): container finished" podID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerID="f01fafa23647d2c15c12e9ac89a35b57dc0ffe34180d8863d99d69493d129fe2" exitCode=0 Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995795 13205 generic.go:334] "Generic (PLEG): container finished" podID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerID="3aa897760287be7bf5678ba84bdc3a14c8994b11d15cd84e87e2e74bc308cfec" exitCode=0 Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995801 13205 generic.go:334] "Generic (PLEG): container finished" podID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerID="4e02fedc03344827c1ab9002c8d47589bccbb85c521ddda8cf2e2351989807a5" exitCode=0 Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995807 13205 generic.go:334] "Generic (PLEG): container finished" podID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerID="17884e5909f5f6cdc750bcb0af814ac2c5ddff0fda4e8d2db5915fe8b6602930" exitCode=0 Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995825 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"0c056e4adcec340ff25f4bf0e81c1d601d8672e70e9ed3ad93cb6aaf58259ee8"} Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995850 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"b58230ea6728c09693b47e9b3c47f1cdfd5bb13858f7d22418d0a0bdedc1cc40"} Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995859 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"f01fafa23647d2c15c12e9ac89a35b57dc0ffe34180d8863d99d69493d129fe2"} Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995869 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"3aa897760287be7bf5678ba84bdc3a14c8994b11d15cd84e87e2e74bc308cfec"} Mar 19 09:33:52.995867 master-0 kubenswrapper[13205]: I0319 09:33:52.995880 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"4e02fedc03344827c1ab9002c8d47589bccbb85c521ddda8cf2e2351989807a5"} Mar 19 09:33:52.996588 master-0 kubenswrapper[13205]: I0319 09:33:52.995888 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"17884e5909f5f6cdc750bcb0af814ac2c5ddff0fda4e8d2db5915fe8b6602930"} Mar 19 09:33:53.106407 master-0 kubenswrapper[13205]: I0319 09:33:53.106365 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:33:53.201164 master-0 kubenswrapper[13205]: I0319 09:33:53.201073 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-metrics-client-ca\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.201393 master-0 kubenswrapper[13205]: I0319 09:33:53.201376 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-web\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.201469 master-0 kubenswrapper[13205]: I0319 09:33:53.201457 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.201561 master-0 kubenswrapper[13205]: I0319 09:33:53.201548 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-main-db\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.201637 master-0 kubenswrapper[13205]: I0319 09:33:53.201587 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: 
"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:33:53.201712 master-0 kubenswrapper[13205]: I0319 09:33:53.201700 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-trusted-ca-bundle\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.201786 master-0 kubenswrapper[13205]: I0319 09:33:53.201775 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-main-tls\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.201859 master-0 kubenswrapper[13205]: I0319 09:33:53.201847 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-volume\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.201960 master-0 kubenswrapper[13205]: I0319 09:33:53.201947 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.202069 master-0 kubenswrapper[13205]: I0319 09:33:53.202057 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-out\") pod 
\"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.202178 master-0 kubenswrapper[13205]: I0319 09:33:53.202166 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-tls-assets\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.202261 master-0 kubenswrapper[13205]: I0319 09:33:53.202048 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:33:53.202304 master-0 kubenswrapper[13205]: I0319 09:33:53.202129 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:33:53.202341 master-0 kubenswrapper[13205]: I0319 09:33:53.202245 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-web-config\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.202496 master-0 kubenswrapper[13205]: I0319 09:33:53.202440 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x49wh\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-kube-api-access-x49wh\") pod \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\" (UID: \"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b\") " Mar 19 09:33:53.203279 master-0 kubenswrapper[13205]: I0319 09:33:53.203250 13205 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.204066 master-0 kubenswrapper[13205]: I0319 09:33:53.204041 13205 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.204122 master-0 kubenswrapper[13205]: I0319 09:33:53.204084 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:53.204122 master-0 kubenswrapper[13205]: I0319 09:33:53.204092 13205 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.204557 master-0 kubenswrapper[13205]: I0319 09:33:53.204504 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:53.204810 master-0 kubenswrapper[13205]: I0319 09:33:53.204786 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:33:53.206082 master-0 kubenswrapper[13205]: I0319 09:33:53.205512 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-kube-api-access-x49wh" (OuterVolumeSpecName: "kube-api-access-x49wh") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "kube-api-access-x49wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:33:53.206082 master-0 kubenswrapper[13205]: I0319 09:33:53.205913 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-out" (OuterVolumeSpecName: "config-out") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:33:53.206498 master-0 kubenswrapper[13205]: I0319 09:33:53.206464 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:53.206611 master-0 kubenswrapper[13205]: I0319 09:33:53.206512 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:53.206696 master-0 kubenswrapper[13205]: I0319 09:33:53.206630 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-volume" (OuterVolumeSpecName: "config-volume") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:53.250156 master-0 kubenswrapper[13205]: I0319 09:33:53.250084 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-web-config" (OuterVolumeSpecName: "web-config") pod "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" (UID: "35d9f4bd-97d8-42be-b5a7-0c8cbf45350b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:53.305613 master-0 kubenswrapper[13205]: I0319 09:33:53.305486 13205 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.305613 master-0 kubenswrapper[13205]: I0319 09:33:53.305589 13205 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.305613 master-0 kubenswrapper[13205]: I0319 09:33:53.305602 13205 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.305613 master-0 kubenswrapper[13205]: I0319 09:33:53.305614 13205 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-volume\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.305613 master-0 kubenswrapper[13205]: I0319 09:33:53.305626 13205 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.306053 master-0 kubenswrapper[13205]: I0319 09:33:53.305636 13205 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-config-out\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.306053 master-0 kubenswrapper[13205]: I0319 09:33:53.305646 13205 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.306053 master-0 kubenswrapper[13205]: I0319 09:33:53.305654 13205 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-web-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:53.306053 master-0 kubenswrapper[13205]: I0319 09:33:53.305664 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x49wh\" (UniqueName: \"kubernetes.io/projected/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b-kube-api-access-x49wh\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:54.008901 master-0 kubenswrapper[13205]: I0319 09:33:54.008760 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"35d9f4bd-97d8-42be-b5a7-0c8cbf45350b","Type":"ContainerDied","Data":"e5b3016c33ac078d83ccd3c0c97f494889360e9c7848774d638179ce5e816894"} Mar 19 09:33:54.008901 master-0 kubenswrapper[13205]: I0319 09:33:54.008856 13205 scope.go:117] "RemoveContainer" containerID="0c056e4adcec340ff25f4bf0e81c1d601d8672e70e9ed3ad93cb6aaf58259ee8" Mar 19 09:33:54.008901 master-0 kubenswrapper[13205]: I0319 09:33:54.008886 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:33:54.026508 master-0 kubenswrapper[13205]: I0319 09:33:54.026459 13205 scope.go:117] "RemoveContainer" containerID="b58230ea6728c09693b47e9b3c47f1cdfd5bb13858f7d22418d0a0bdedc1cc40" Mar 19 09:33:54.040431 master-0 kubenswrapper[13205]: I0319 09:33:54.040302 13205 scope.go:117] "RemoveContainer" containerID="f01fafa23647d2c15c12e9ac89a35b57dc0ffe34180d8863d99d69493d129fe2" Mar 19 09:33:54.055420 master-0 kubenswrapper[13205]: I0319 09:33:54.055371 13205 scope.go:117] "RemoveContainer" containerID="3aa897760287be7bf5678ba84bdc3a14c8994b11d15cd84e87e2e74bc308cfec" Mar 19 09:33:54.073353 master-0 kubenswrapper[13205]: I0319 09:33:54.073222 13205 scope.go:117] "RemoveContainer" containerID="4e02fedc03344827c1ab9002c8d47589bccbb85c521ddda8cf2e2351989807a5" Mar 19 09:33:54.086276 master-0 kubenswrapper[13205]: I0319 09:33:54.086039 13205 scope.go:117] "RemoveContainer" containerID="17884e5909f5f6cdc750bcb0af814ac2c5ddff0fda4e8d2db5915fe8b6602930" Mar 19 09:33:54.107883 master-0 kubenswrapper[13205]: I0319 09:33:54.104385 13205 scope.go:117] "RemoveContainer" containerID="90bdbbe49c2c62e2be6bd1d4d57ced9f777ee8aff32c998b313fe88cbe77c54e" Mar 19 09:33:54.176084 master-0 kubenswrapper[13205]: I0319 09:33:54.175938 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:33:54.363961 master-0 kubenswrapper[13205]: I0319 09:33:54.363911 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:33:54.856547 master-0 kubenswrapper[13205]: I0319 09:33:54.856476 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" path="/var/lib/kubelet/pods/35d9f4bd-97d8-42be-b5a7-0c8cbf45350b/volumes" Mar 19 09:33:57.144032 master-0 kubenswrapper[13205]: I0319 09:33:57.143970 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:33:57.144032 master-0 kubenswrapper[13205]: I0319 09:33:57.144022 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:33:59.532189 master-0 kubenswrapper[13205]: I0319 09:33:59.532087 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:33:59.532945 master-0 kubenswrapper[13205]: I0319 09:33:59.532515 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" containerID="cri-o://793cfb93f2346e0ad23e32cbd1e114aae92c03db2ff0726f899f8a1c39d66416" gracePeriod=30 Mar 19 09:33:59.532945 master-0 kubenswrapper[13205]: I0319 09:33:59.532631 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" containerID="cri-o://0a17e7848d06038a69e2540781de2a324d8067bd69c0598df08e190c706b5066" gracePeriod=30 Mar 19 09:33:59.532945 master-0 kubenswrapper[13205]: I0319 09:33:59.532613 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" containerID="cri-o://96f501d33ba99906fcc67f343ffb6c0314c555d3c6113a843511ffa7ed7f311a" gracePeriod=30 Mar 19 
09:33:59.533888 master-0 kubenswrapper[13205]: I0319 09:33:59.533833 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:33:59.534210 master-0 kubenswrapper[13205]: E0319 09:33:59.534173 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="wait-for-host-port" Mar 19 09:33:59.534210 master-0 kubenswrapper[13205]: I0319 09:33:59.534203 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="wait-for-host-port" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: E0319 09:33:59.534230 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="init-config-reloader" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: I0319 09:33:59.534241 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="init-config-reloader" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: E0319 09:33:59.534257 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: I0319 09:33:59.534268 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: E0319 09:33:59.534289 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: I0319 09:33:59.534300 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: E0319 09:33:59.534320 13205 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="config-reloader" Mar 19 09:33:59.534332 master-0 kubenswrapper[13205]: I0319 09:33:59.534331 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="config-reloader" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534348 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534359 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534378 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="prom-label-proxy" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534388 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="prom-label-proxy" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534402 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="alertmanager" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534412 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="alertmanager" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534433 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534444 13205 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534463 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534473 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534491 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534501 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534540 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy-web" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534553 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy-web" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: E0319 09:33:59.534569 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy-metric" Mar 19 09:33:59.534696 master-0 kubenswrapper[13205]: I0319 09:33:59.534579 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy-metric" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534767 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" 
containerName="kube-scheduler" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534790 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534815 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="prom-label-proxy" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534831 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="alertmanager" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534850 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="kube-rbac-proxy-metric" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534868 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" containerName="config-reloader" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534882 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534902 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534920 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.534936 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d9f4bd-97d8-42be-b5a7-0c8cbf45350b" 
containerName="kube-rbac-proxy-web" Mar 19 09:33:59.535321 master-0 kubenswrapper[13205]: I0319 09:33:59.535321 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 09:33:59.631895 master-0 kubenswrapper[13205]: I0319 09:33:59.631822 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:33:59.632137 master-0 kubenswrapper[13205]: I0319 09:33:59.631971 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:33:59.635674 master-0 kubenswrapper[13205]: I0319 09:33:59.635632 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 09:33:59.635877 master-0 kubenswrapper[13205]: I0319 09:33:59.635821 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-5-master-0" podUID="621a61b9-69f0-4bbe-ae33-56a4473c72ee" containerName="installer" containerID="cri-o://e7bf368fcae180a0ba7541554d44ad054ecd176c57a65be6c28b9187d83dd5f6" gracePeriod=30 Mar 19 09:33:59.639346 master-0 kubenswrapper[13205]: I0319 09:33:59.639285 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 
09:33:59.730894 master-0 kubenswrapper[13205]: I0319 09:33:59.730829 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/1.log" Mar 19 09:33:59.732333 master-0 kubenswrapper[13205]: I0319 09:33:59.732301 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log" Mar 19 09:33:59.732645 master-0 kubenswrapper[13205]: I0319 09:33:59.732482 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:33:59.732645 master-0 kubenswrapper[13205]: I0319 09:33:59.732564 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:33:59.732645 master-0 kubenswrapper[13205]: I0319 09:33:59.732585 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:33:59.732645 master-0 kubenswrapper[13205]: I0319 09:33:59.732646 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:33:59.733085 master-0 kubenswrapper[13205]: I0319 09:33:59.732866 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log" Mar 19 09:33:59.733315 master-0 kubenswrapper[13205]: I0319 09:33:59.733287 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:33:59.736354 master-0 kubenswrapper[13205]: I0319 09:33:59.736304 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 09:33:59.845501 master-0 kubenswrapper[13205]: I0319 09:33:59.845198 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"8413125cf444e5c95f023c5dd9c6151e\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " Mar 19 09:33:59.845501 master-0 kubenswrapper[13205]: I0319 09:33:59.845330 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"8413125cf444e5c95f023c5dd9c6151e\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " Mar 19 09:33:59.845884 master-0 kubenswrapper[13205]: I0319 09:33:59.845761 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir" (OuterVolumeSpecName: "cert-dir") pod 
"8413125cf444e5c95f023c5dd9c6151e" (UID: "8413125cf444e5c95f023c5dd9c6151e"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:33:59.845884 master-0 kubenswrapper[13205]: I0319 09:33:59.845797 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8413125cf444e5c95f023c5dd9c6151e" (UID: "8413125cf444e5c95f023c5dd9c6151e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:33:59.846217 master-0 kubenswrapper[13205]: I0319 09:33:59.846158 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:33:59.846361 master-0 kubenswrapper[13205]: I0319 09:33:59.846225 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:00.063362 master-0 kubenswrapper[13205]: I0319 09:34:00.059229 13205 generic.go:334] "Generic (PLEG): container finished" podID="621a61b9-69f0-4bbe-ae33-56a4473c72ee" containerID="e7bf368fcae180a0ba7541554d44ad054ecd176c57a65be6c28b9187d83dd5f6" exitCode=0 Mar 19 09:34:00.063362 master-0 kubenswrapper[13205]: I0319 09:34:00.059331 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"621a61b9-69f0-4bbe-ae33-56a4473c72ee","Type":"ContainerDied","Data":"e7bf368fcae180a0ba7541554d44ad054ecd176c57a65be6c28b9187d83dd5f6"} Mar 19 09:34:00.063751 master-0 kubenswrapper[13205]: I0319 09:34:00.063605 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/1.log" Mar 19 09:34:00.064357 master-0 kubenswrapper[13205]: I0319 09:34:00.064330 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:34:00.069266 master-0 kubenswrapper[13205]: I0319 09:34:00.069232 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log" Mar 19 09:34:00.070208 master-0 kubenswrapper[13205]: I0319 09:34:00.070173 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log" Mar 19 09:34:00.070740 master-0 kubenswrapper[13205]: I0319 09:34:00.070703 13205 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="96f501d33ba99906fcc67f343ffb6c0314c555d3c6113a843511ffa7ed7f311a" exitCode=2 Mar 19 09:34:00.070740 master-0 kubenswrapper[13205]: I0319 09:34:00.070728 13205 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="0a17e7848d06038a69e2540781de2a324d8067bd69c0598df08e190c706b5066" exitCode=0 Mar 19 09:34:00.070740 master-0 kubenswrapper[13205]: I0319 09:34:00.070735 13205 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="793cfb93f2346e0ad23e32cbd1e114aae92c03db2ff0726f899f8a1c39d66416" exitCode=0 Mar 19 09:34:00.070904 master-0 kubenswrapper[13205]: I0319 09:34:00.070769 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6cda39585354e47346ec04d7e9023161d8c669dfe02492069483d076fdb9801" Mar 19 09:34:00.070904 master-0 kubenswrapper[13205]: I0319 09:34:00.070784 13205 
scope.go:117] "RemoveContainer" containerID="6d8e777ee2c690477b890e212d15377f6f78a023a47f6d1ccdb66d4fd4236c20" Mar 19 09:34:00.070904 master-0 kubenswrapper[13205]: I0319 09:34:00.070856 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:34:00.085978 master-0 kubenswrapper[13205]: I0319 09:34:00.085930 13205 scope.go:117] "RemoveContainer" containerID="57919871ecdce20adcf14d4b3e782688c40e27d380e27e5683da1cfdca89a184" Mar 19 09:34:00.092869 master-0 kubenswrapper[13205]: I0319 09:34:00.092806 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 09:34:00.116497 master-0 kubenswrapper[13205]: I0319 09:34:00.116404 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 09:34:00.151025 master-0 kubenswrapper[13205]: I0319 09:34:00.150954 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-var-lock\") pod \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " Mar 19 09:34:00.151220 master-0 kubenswrapper[13205]: I0319 09:34:00.151046 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kube-api-access\") pod \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " Mar 19 09:34:00.151220 master-0 kubenswrapper[13205]: I0319 09:34:00.151099 13205 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kubelet-dir\") pod \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\" (UID: \"621a61b9-69f0-4bbe-ae33-56a4473c72ee\") " Mar 19 09:34:00.151220 master-0 kubenswrapper[13205]: I0319 09:34:00.151134 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-var-lock" (OuterVolumeSpecName: "var-lock") pod "621a61b9-69f0-4bbe-ae33-56a4473c72ee" (UID: "621a61b9-69f0-4bbe-ae33-56a4473c72ee"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:00.151414 master-0 kubenswrapper[13205]: I0319 09:34:00.151378 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:00.151621 master-0 kubenswrapper[13205]: I0319 09:34:00.151579 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "621a61b9-69f0-4bbe-ae33-56a4473c72ee" (UID: "621a61b9-69f0-4bbe-ae33-56a4473c72ee"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:00.153944 master-0 kubenswrapper[13205]: I0319 09:34:00.153882 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "621a61b9-69f0-4bbe-ae33-56a4473c72ee" (UID: "621a61b9-69f0-4bbe-ae33-56a4473c72ee"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:34:00.252683 master-0 kubenswrapper[13205]: I0319 09:34:00.252638 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:00.252916 master-0 kubenswrapper[13205]: I0319 09:34:00.252901 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/621a61b9-69f0-4bbe-ae33-56a4473c72ee-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:00.857728 master-0 kubenswrapper[13205]: I0319 09:34:00.857502 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8413125cf444e5c95f023c5dd9c6151e" path="/var/lib/kubelet/pods/8413125cf444e5c95f023c5dd9c6151e/volumes" Mar 19 09:34:01.079720 master-0 kubenswrapper[13205]: I0319 09:34:01.079653 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/1.log" Mar 19 09:34:01.081865 master-0 kubenswrapper[13205]: I0319 09:34:01.081830 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"621a61b9-69f0-4bbe-ae33-56a4473c72ee","Type":"ContainerDied","Data":"4afcbd4532566468202978264189558122552ec1de0de3ab5ebc6b167bd4e785"} Mar 19 09:34:01.081934 master-0 kubenswrapper[13205]: I0319 09:34:01.081876 13205 scope.go:117] "RemoveContainer" containerID="e7bf368fcae180a0ba7541554d44ad054ecd176c57a65be6c28b9187d83dd5f6" Mar 19 09:34:01.081969 master-0 kubenswrapper[13205]: I0319 09:34:01.081957 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:34:01.106487 master-0 kubenswrapper[13205]: I0319 09:34:01.106445 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 09:34:01.110374 master-0 kubenswrapper[13205]: I0319 09:34:01.110290 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 09:34:01.831354 master-0 kubenswrapper[13205]: I0319 09:34:01.831305 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 19 09:34:01.831612 master-0 kubenswrapper[13205]: E0319 09:34:01.831554 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621a61b9-69f0-4bbe-ae33-56a4473c72ee" containerName="installer" Mar 19 09:34:01.831612 master-0 kubenswrapper[13205]: I0319 09:34:01.831565 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="621a61b9-69f0-4bbe-ae33-56a4473c72ee" containerName="installer" Mar 19 09:34:01.831692 master-0 kubenswrapper[13205]: I0319 09:34:01.831685 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="621a61b9-69f0-4bbe-ae33-56a4473c72ee" containerName="installer" Mar 19 09:34:01.832085 master-0 kubenswrapper[13205]: I0319 09:34:01.832064 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.833740 master-0 kubenswrapper[13205]: I0319 09:34:01.833705 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 09:34:01.834340 master-0 kubenswrapper[13205]: I0319 09:34:01.834298 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-xzz4b" Mar 19 09:34:01.853447 master-0 kubenswrapper[13205]: I0319 09:34:01.853392 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 19 09:34:01.875810 master-0 kubenswrapper[13205]: I0319 09:34:01.875744 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kube-api-access\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.876346 master-0 kubenswrapper[13205]: I0319 09:34:01.875930 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-var-lock\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.876346 master-0 kubenswrapper[13205]: I0319 09:34:01.876143 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.977320 master-0 kubenswrapper[13205]: I0319 09:34:01.977247 13205 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kube-api-access\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.977320 master-0 kubenswrapper[13205]: I0319 09:34:01.977305 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-var-lock\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.977614 master-0 kubenswrapper[13205]: I0319 09:34:01.977350 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.977614 master-0 kubenswrapper[13205]: I0319 09:34:01.977437 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.977614 master-0 kubenswrapper[13205]: I0319 09:34:01.977476 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-var-lock\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:01.998586 master-0 kubenswrapper[13205]: I0319 09:34:01.998463 13205 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kube-api-access\") pod \"installer-6-master-0\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:02.165073 master-0 kubenswrapper[13205]: I0319 09:34:02.164923 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 19 09:34:02.575416 master-0 kubenswrapper[13205]: I0319 09:34:02.575343 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 19 09:34:02.857312 master-0 kubenswrapper[13205]: I0319 09:34:02.857237 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621a61b9-69f0-4bbe-ae33-56a4473c72ee" path="/var/lib/kubelet/pods/621a61b9-69f0-4bbe-ae33-56a4473c72ee/volumes" Mar 19 09:34:03.099748 master-0 kubenswrapper[13205]: I0319 09:34:03.099618 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"d1c20f3b-cd10-4eac-88d8-70db61994bc2","Type":"ContainerStarted","Data":"c7843ba93e04bff51321b8359a36b5530e83a8a395053571f10313a5c1e683f4"} Mar 19 09:34:03.099748 master-0 kubenswrapper[13205]: I0319 09:34:03.099665 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"d1c20f3b-cd10-4eac-88d8-70db61994bc2","Type":"ContainerStarted","Data":"936eba05bcac3121ff6e515dec952de8a1921b858d6cf95cb1b8dcb3c3886f59"} Mar 19 09:34:03.120686 master-0 kubenswrapper[13205]: I0319 09:34:03.120589 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-master-0" podStartSLOduration=2.1205669719999998 podStartE2EDuration="2.120566972s" podCreationTimestamp="2026-03-19 09:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:34:03.117418004 +0000 UTC m=+628.449724912" watchObservedRunningTime="2026-03-19 09:34:03.120566972 +0000 UTC m=+628.452873890" Mar 19 09:34:04.001009 master-0 kubenswrapper[13205]: I0319 09:34:04.000916 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_f7b5036d-9738-4e7e-a11f-ed64194ea30f/installer/0.log" Mar 19 09:34:04.001009 master-0 kubenswrapper[13205]: I0319 09:34:04.000995 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:34:04.107907 master-0 kubenswrapper[13205]: I0319 09:34:04.107853 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_f7b5036d-9738-4e7e-a11f-ed64194ea30f/installer/0.log" Mar 19 09:34:04.108380 master-0 kubenswrapper[13205]: I0319 09:34:04.107924 13205 generic.go:334] "Generic (PLEG): container finished" podID="f7b5036d-9738-4e7e-a11f-ed64194ea30f" containerID="ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8" exitCode=1 Mar 19 09:34:04.108380 master-0 kubenswrapper[13205]: I0319 09:34:04.107997 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:34:04.108380 master-0 kubenswrapper[13205]: I0319 09:34:04.108017 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"f7b5036d-9738-4e7e-a11f-ed64194ea30f","Type":"ContainerDied","Data":"ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8"} Mar 19 09:34:04.108380 master-0 kubenswrapper[13205]: I0319 09:34:04.108080 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"f7b5036d-9738-4e7e-a11f-ed64194ea30f","Type":"ContainerDied","Data":"f9221f0cda6995023043a3926b6b04edb9f0474b37a79ab4819d0f125bcb4d0e"} Mar 19 09:34:04.108380 master-0 kubenswrapper[13205]: I0319 09:34:04.108118 13205 scope.go:117] "RemoveContainer" containerID="ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8" Mar 19 09:34:04.124608 master-0 kubenswrapper[13205]: I0319 09:34:04.124517 13205 scope.go:117] "RemoveContainer" containerID="ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8" Mar 19 09:34:04.125336 master-0 kubenswrapper[13205]: E0319 09:34:04.125220 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8\": container with ID starting with ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8 not found: ID does not exist" containerID="ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8" Mar 19 09:34:04.125336 master-0 kubenswrapper[13205]: I0319 09:34:04.125273 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8"} err="failed to get container status \"ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8\": rpc error: 
code = NotFound desc = could not find container \"ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8\": container with ID starting with ab1358487b39ab4ad0a5ec486d45cb29dd67a19253b07b344b08396d420d24a8 not found: ID does not exist" Mar 19 09:34:04.162997 master-0 kubenswrapper[13205]: I0319 09:34:04.162948 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kube-api-access\") pod \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " Mar 19 09:34:04.162997 master-0 kubenswrapper[13205]: I0319 09:34:04.162997 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kubelet-dir\") pod \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " Mar 19 09:34:04.163903 master-0 kubenswrapper[13205]: I0319 09:34:04.163077 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-var-lock\") pod \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\" (UID: \"f7b5036d-9738-4e7e-a11f-ed64194ea30f\") " Mar 19 09:34:04.163903 master-0 kubenswrapper[13205]: I0319 09:34:04.163071 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f7b5036d-9738-4e7e-a11f-ed64194ea30f" (UID: "f7b5036d-9738-4e7e-a11f-ed64194ea30f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:04.163903 master-0 kubenswrapper[13205]: I0319 09:34:04.163194 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-var-lock" (OuterVolumeSpecName: "var-lock") pod "f7b5036d-9738-4e7e-a11f-ed64194ea30f" (UID: "f7b5036d-9738-4e7e-a11f-ed64194ea30f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:04.163903 master-0 kubenswrapper[13205]: I0319 09:34:04.163562 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:04.163903 master-0 kubenswrapper[13205]: I0319 09:34:04.163582 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7b5036d-9738-4e7e-a11f-ed64194ea30f-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:04.166643 master-0 kubenswrapper[13205]: I0319 09:34:04.166481 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f7b5036d-9738-4e7e-a11f-ed64194ea30f" (UID: "f7b5036d-9738-4e7e-a11f-ed64194ea30f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:34:04.265498 master-0 kubenswrapper[13205]: I0319 09:34:04.265431 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b5036d-9738-4e7e-a11f-ed64194ea30f-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:04.465359 master-0 kubenswrapper[13205]: I0319 09:34:04.465273 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 09:34:04.472302 master-0 kubenswrapper[13205]: I0319 09:34:04.472233 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 09:34:04.875458 master-0 kubenswrapper[13205]: I0319 09:34:04.875374 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b5036d-9738-4e7e-a11f-ed64194ea30f" path="/var/lib/kubelet/pods/f7b5036d-9738-4e7e-a11f-ed64194ea30f/volumes" Mar 19 09:34:05.087213 master-0 kubenswrapper[13205]: I0319 09:34:05.087094 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 09:34:05.087702 master-0 kubenswrapper[13205]: I0319 09:34:05.087650 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-4-master-0" podUID="3cb82068-9d79-4917-88fd-07cd7a9adbb4" containerName="installer" containerID="cri-o://2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb" gracePeriod=30 Mar 19 09:34:05.543825 master-0 kubenswrapper[13205]: I0319 09:34:05.543765 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_3cb82068-9d79-4917-88fd-07cd7a9adbb4/installer/0.log" Mar 19 09:34:05.544258 master-0 kubenswrapper[13205]: I0319 09:34:05.543864 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:34:05.688959 master-0 kubenswrapper[13205]: I0319 09:34:05.688879 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kube-api-access\") pod \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " Mar 19 09:34:05.689418 master-0 kubenswrapper[13205]: I0319 09:34:05.689379 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-var-lock\") pod \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " Mar 19 09:34:05.689578 master-0 kubenswrapper[13205]: I0319 09:34:05.689492 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kubelet-dir\") pod \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\" (UID: \"3cb82068-9d79-4917-88fd-07cd7a9adbb4\") " Mar 19 09:34:05.689661 master-0 kubenswrapper[13205]: I0319 09:34:05.689587 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-var-lock" (OuterVolumeSpecName: "var-lock") pod "3cb82068-9d79-4917-88fd-07cd7a9adbb4" (UID: "3cb82068-9d79-4917-88fd-07cd7a9adbb4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:05.689762 master-0 kubenswrapper[13205]: I0319 09:34:05.689734 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3cb82068-9d79-4917-88fd-07cd7a9adbb4" (UID: "3cb82068-9d79-4917-88fd-07cd7a9adbb4"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:05.689937 master-0 kubenswrapper[13205]: I0319 09:34:05.689898 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:05.690019 master-0 kubenswrapper[13205]: I0319 09:34:05.689949 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:05.691825 master-0 kubenswrapper[13205]: I0319 09:34:05.691781 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3cb82068-9d79-4917-88fd-07cd7a9adbb4" (UID: "3cb82068-9d79-4917-88fd-07cd7a9adbb4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:34:05.791607 master-0 kubenswrapper[13205]: I0319 09:34:05.791496 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cb82068-9d79-4917-88fd-07cd7a9adbb4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:06.131501 master-0 kubenswrapper[13205]: I0319 09:34:06.131346 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_3cb82068-9d79-4917-88fd-07cd7a9adbb4/installer/0.log" Mar 19 09:34:06.131501 master-0 kubenswrapper[13205]: I0319 09:34:06.131421 13205 generic.go:334] "Generic (PLEG): container finished" podID="3cb82068-9d79-4917-88fd-07cd7a9adbb4" containerID="2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb" exitCode=1 Mar 19 09:34:06.131501 master-0 kubenswrapper[13205]: I0319 09:34:06.131463 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"3cb82068-9d79-4917-88fd-07cd7a9adbb4","Type":"ContainerDied","Data":"2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb"} Mar 19 09:34:06.131959 master-0 kubenswrapper[13205]: I0319 09:34:06.131522 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"3cb82068-9d79-4917-88fd-07cd7a9adbb4","Type":"ContainerDied","Data":"20b9c18b57eee9cebcb0a968ade99e915a2d56bf0568ccfa90cc604f27af0e06"} Mar 19 09:34:06.131959 master-0 kubenswrapper[13205]: I0319 09:34:06.131570 13205 scope.go:117] "RemoveContainer" containerID="2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb" Mar 19 09:34:06.131959 master-0 kubenswrapper[13205]: I0319 09:34:06.131788 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 19 09:34:06.158072 master-0 kubenswrapper[13205]: I0319 09:34:06.158036 13205 scope.go:117] "RemoveContainer" containerID="2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb" Mar 19 09:34:06.159942 master-0 kubenswrapper[13205]: E0319 09:34:06.159884 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb\": container with ID starting with 2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb not found: ID does not exist" containerID="2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb" Mar 19 09:34:06.160026 master-0 kubenswrapper[13205]: I0319 09:34:06.159953 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb"} err="failed to get container status \"2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb\": rpc error: code = NotFound desc = could not find container \"2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb\": container with ID starting with 2de8f471914186c6507f6bc7edd978ad739f75022d57d43ddc21531ca49910eb not found: ID does not exist" Mar 19 09:34:06.166066 master-0 kubenswrapper[13205]: I0319 09:34:06.165995 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 09:34:06.174654 master-0 kubenswrapper[13205]: I0319 09:34:06.174587 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 19 09:34:06.857801 master-0 kubenswrapper[13205]: I0319 09:34:06.857745 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb82068-9d79-4917-88fd-07cd7a9adbb4" 
path="/var/lib/kubelet/pods/3cb82068-9d79-4917-88fd-07cd7a9adbb4/volumes" Mar 19 09:34:07.144314 master-0 kubenswrapper[13205]: I0319 09:34:07.144254 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:34:07.144314 master-0 kubenswrapper[13205]: I0319 09:34:07.144317 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:34:07.292897 master-0 kubenswrapper[13205]: I0319 09:34:07.292806 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 19 09:34:07.293232 master-0 kubenswrapper[13205]: E0319 09:34:07.293160 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb82068-9d79-4917-88fd-07cd7a9adbb4" containerName="installer" Mar 19 09:34:07.293232 master-0 kubenswrapper[13205]: I0319 09:34:07.293178 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb82068-9d79-4917-88fd-07cd7a9adbb4" containerName="installer" Mar 19 09:34:07.293232 master-0 kubenswrapper[13205]: E0319 09:34:07.293198 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b5036d-9738-4e7e-a11f-ed64194ea30f" containerName="installer" Mar 19 09:34:07.293232 master-0 kubenswrapper[13205]: I0319 09:34:07.293206 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b5036d-9738-4e7e-a11f-ed64194ea30f" containerName="installer" Mar 19 09:34:07.293595 master-0 kubenswrapper[13205]: I0319 09:34:07.293392 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb82068-9d79-4917-88fd-07cd7a9adbb4" 
containerName="installer" Mar 19 09:34:07.293595 master-0 kubenswrapper[13205]: I0319 09:34:07.293418 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b5036d-9738-4e7e-a11f-ed64194ea30f" containerName="installer" Mar 19 09:34:07.293945 master-0 kubenswrapper[13205]: I0319 09:34:07.293903 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.296437 master-0 kubenswrapper[13205]: I0319 09:34:07.296358 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:34:07.296667 master-0 kubenswrapper[13205]: I0319 09:34:07.296610 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nm2j7" Mar 19 09:34:07.299944 master-0 kubenswrapper[13205]: I0319 09:34:07.299876 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 19 09:34:07.414408 master-0 kubenswrapper[13205]: I0319 09:34:07.414251 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-var-lock\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.414632 master-0 kubenswrapper[13205]: I0319 09:34:07.414583 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/042705f9-eeff-4d51-808d-6da4be0720d3-kube-api-access\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.414680 master-0 kubenswrapper[13205]: I0319 09:34:07.414648 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.516186 master-0 kubenswrapper[13205]: I0319 09:34:07.516113 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-var-lock\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.516461 master-0 kubenswrapper[13205]: I0319 09:34:07.516266 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/042705f9-eeff-4d51-808d-6da4be0720d3-kube-api-access\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.516461 master-0 kubenswrapper[13205]: I0319 09:34:07.516280 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-var-lock\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.516461 master-0 kubenswrapper[13205]: I0319 09:34:07.516306 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:07.516461 master-0 
kubenswrapper[13205]: I0319 09:34:07.516360 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 19 09:34:07.534910 master-0 kubenswrapper[13205]: I0319 09:34:07.534847 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/042705f9-eeff-4d51-808d-6da4be0720d3-kube-api-access\") pod \"installer-5-master-0\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 19 09:34:07.638032 master-0 kubenswrapper[13205]: I0319 09:34:07.637946 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 19 09:34:08.028522 master-0 kubenswrapper[13205]: I0319 09:34:08.028449 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"]
Mar 19 09:34:08.147453 master-0 kubenswrapper[13205]: I0319 09:34:08.147398 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"042705f9-eeff-4d51-808d-6da4be0720d3","Type":"ContainerStarted","Data":"69f4f783fc8ad486f7e7dc475fd1bf648e6ee7f761f2256c8685ccba9ae33d2c"}
Mar 19 09:34:09.153732 master-0 kubenswrapper[13205]: I0319 09:34:09.153688 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"042705f9-eeff-4d51-808d-6da4be0720d3","Type":"ContainerStarted","Data":"807f743c51c5f0aa8dd41cf4470dc35b029874edbfece3999d20b141b8602497"}
Mar 19 09:34:09.168406 master-0 kubenswrapper[13205]: I0319 09:34:09.168320 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-0" podStartSLOduration=2.168300085 podStartE2EDuration="2.168300085s" podCreationTimestamp="2026-03-19 09:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:34:09.167269809 +0000 UTC m=+634.499576717" watchObservedRunningTime="2026-03-19 09:34:09.168300085 +0000 UTC m=+634.500606993"
Mar 19 09:34:10.848996 master-0 kubenswrapper[13205]: I0319 09:34:10.848910 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:10.866988 master-0 kubenswrapper[13205]: I0319 09:34:10.866934 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="6069e006-17f8-4a91-b37f-260e8bb3e4c7"
Mar 19 09:34:10.866988 master-0 kubenswrapper[13205]: I0319 09:34:10.866974 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="6069e006-17f8-4a91-b37f-260e8bb3e4c7"
Mar 19 09:34:10.885770 master-0 kubenswrapper[13205]: I0319 09:34:10.885698 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:34:10.889370 master-0 kubenswrapper[13205]: I0319 09:34:10.889308 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:10.901719 master-0 kubenswrapper[13205]: I0319 09:34:10.901640 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:34:10.905907 master-0 kubenswrapper[13205]: I0319 09:34:10.905851 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:10.908714 master-0 kubenswrapper[13205]: I0319 09:34:10.908670 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:34:10.944543 master-0 kubenswrapper[13205]: W0319 09:34:10.944464 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e27b7d086edf5d2cf47b703574641d8.slice/crio-310b59e18579147352422c9e3d3401a2e5954a4c0b731fa394515b931e80f20a WatchSource:0}: Error finding container 310b59e18579147352422c9e3d3401a2e5954a4c0b731fa394515b931e80f20a: Status 404 returned error can't find the container with id 310b59e18579147352422c9e3d3401a2e5954a4c0b731fa394515b931e80f20a
Mar 19 09:34:11.170743 master-0 kubenswrapper[13205]: I0319 09:34:11.170683 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"310b59e18579147352422c9e3d3401a2e5954a4c0b731fa394515b931e80f20a"}
Mar 19 09:34:12.182402 master-0 kubenswrapper[13205]: I0319 09:34:12.182335 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"0285ae72e28c1ba11c6c5f6c02e25350cf91b897190ff778fd152d801bc77680"}
Mar 19 09:34:12.431019 master-0 kubenswrapper[13205]: I0319 09:34:12.430873 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 19 09:34:12.431867 master-0 kubenswrapper[13205]: I0319 09:34:12.431795 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-5-master-0" podUID="85810df1-4989-449a-8da0-192c8720d5f4" containerName="installer" containerID="cri-o://c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8" gracePeriod=30
Mar 19 09:34:15.830616 master-0 kubenswrapper[13205]: I0319 09:34:15.830430 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"]
Mar 19 09:34:15.833592 master-0 kubenswrapper[13205]: I0319 09:34:15.832767 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:15.846973 master-0 kubenswrapper[13205]: I0319 09:34:15.846861 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"]
Mar 19 09:34:15.897888 master-0 kubenswrapper[13205]: I0319 09:34:15.897820 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_85810df1-4989-449a-8da0-192c8720d5f4/installer/0.log"
Mar 19 09:34:15.898063 master-0 kubenswrapper[13205]: I0319 09:34:15.897929 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 09:34:15.955653 master-0 kubenswrapper[13205]: I0319 09:34:15.955563 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:15.955876 master-0 kubenswrapper[13205]: I0319 09:34:15.955723 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-var-lock\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:15.955876 master-0 kubenswrapper[13205]: I0319 09:34:15.955807 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kube-api-access\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.056615 master-0 kubenswrapper[13205]: I0319 09:34:16.056550 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-kubelet-dir\") pod \"85810df1-4989-449a-8da0-192c8720d5f4\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") "
Mar 19 09:34:16.056615 master-0 kubenswrapper[13205]: I0319 09:34:16.056621 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-var-lock\") pod \"85810df1-4989-449a-8da0-192c8720d5f4\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") "
Mar 19 09:34:16.056865 master-0 kubenswrapper[13205]: I0319 09:34:16.056689 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85810df1-4989-449a-8da0-192c8720d5f4-kube-api-access\") pod \"85810df1-4989-449a-8da0-192c8720d5f4\" (UID: \"85810df1-4989-449a-8da0-192c8720d5f4\") "
Mar 19 09:34:16.056865 master-0 kubenswrapper[13205]: I0319 09:34:16.056760 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-var-lock" (OuterVolumeSpecName: "var-lock") pod "85810df1-4989-449a-8da0-192c8720d5f4" (UID: "85810df1-4989-449a-8da0-192c8720d5f4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:16.056865 master-0 kubenswrapper[13205]: I0319 09:34:16.056850 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "85810df1-4989-449a-8da0-192c8720d5f4" (UID: "85810df1-4989-449a-8da0-192c8720d5f4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:16.057060 master-0 kubenswrapper[13205]: I0319 09:34:16.057024 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.057121 master-0 kubenswrapper[13205]: I0319 09:34:16.057100 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-var-lock\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.057175 master-0 kubenswrapper[13205]: I0319 09:34:16.057152 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.057212 master-0 kubenswrapper[13205]: I0319 09:34:16.057168 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kube-api-access\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.057332 master-0 kubenswrapper[13205]: I0319 09:34:16.057311 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:16.057367 master-0 kubenswrapper[13205]: I0319 09:34:16.057334 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85810df1-4989-449a-8da0-192c8720d5f4-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:16.057400 master-0 kubenswrapper[13205]: I0319 09:34:16.057369 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-var-lock\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.061837 master-0 kubenswrapper[13205]: I0319 09:34:16.061786 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85810df1-4989-449a-8da0-192c8720d5f4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "85810df1-4989-449a-8da0-192c8720d5f4" (UID: "85810df1-4989-449a-8da0-192c8720d5f4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:34:16.081129 master-0 kubenswrapper[13205]: I0319 09:34:16.081003 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kube-api-access\") pod \"installer-6-master-0\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.158794 master-0 kubenswrapper[13205]: I0319 09:34:16.158732 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85810df1-4989-449a-8da0-192c8720d5f4-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:16.191416 master-0 kubenswrapper[13205]: I0319 09:34:16.191352 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:16.221288 master-0 kubenswrapper[13205]: I0319 09:34:16.221247 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_85810df1-4989-449a-8da0-192c8720d5f4/installer/0.log"
Mar 19 09:34:16.221414 master-0 kubenswrapper[13205]: I0319 09:34:16.221328 13205 generic.go:334] "Generic (PLEG): container finished" podID="85810df1-4989-449a-8da0-192c8720d5f4" containerID="c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8" exitCode=1
Mar 19 09:34:16.221414 master-0 kubenswrapper[13205]: I0319 09:34:16.221369 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"85810df1-4989-449a-8da0-192c8720d5f4","Type":"ContainerDied","Data":"c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8"}
Mar 19 09:34:16.221488 master-0 kubenswrapper[13205]: I0319 09:34:16.221411 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"85810df1-4989-449a-8da0-192c8720d5f4","Type":"ContainerDied","Data":"aa07c08eded51738a9a43b5549e93c3b74348793377d3b5c9d05584e1dc87795"}
Mar 19 09:34:16.221488 master-0 kubenswrapper[13205]: I0319 09:34:16.221439 13205 scope.go:117] "RemoveContainer" containerID="c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8"
Mar 19 09:34:16.221488 master-0 kubenswrapper[13205]: I0319 09:34:16.221458 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 19 09:34:16.267718 master-0 kubenswrapper[13205]: I0319 09:34:16.267637 13205 scope.go:117] "RemoveContainer" containerID="c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8"
Mar 19 09:34:16.268139 master-0 kubenswrapper[13205]: E0319 09:34:16.268115 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8\": container with ID starting with c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8 not found: ID does not exist" containerID="c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8"
Mar 19 09:34:16.268193 master-0 kubenswrapper[13205]: I0319 09:34:16.268145 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8"} err="failed to get container status \"c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8\": rpc error: code = NotFound desc = could not find container \"c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8\": container with ID starting with c89e3f465c9d5921be10291ffdcf6db6d1035d0594d3e1db251731ecae6b2aa8 not found: ID does not exist"
Mar 19 09:34:16.284471 master-0 kubenswrapper[13205]: I0319 09:34:16.284103 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 19 09:34:16.303624 master-0 kubenswrapper[13205]: I0319 09:34:16.302713 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 19 09:34:16.628894 master-0 kubenswrapper[13205]: W0319 09:34:16.628841 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc2139b7_8af8_4294_aee2_3e7429d2b1fe.slice/crio-56eab5e1b1b74c5337a671c5cae69f468938e2992f65bdf05f6eb6d21a30bc2c WatchSource:0}: Error finding container 56eab5e1b1b74c5337a671c5cae69f468938e2992f65bdf05f6eb6d21a30bc2c: Status 404 returned error can't find the container with id 56eab5e1b1b74c5337a671c5cae69f468938e2992f65bdf05f6eb6d21a30bc2c
Mar 19 09:34:16.629298 master-0 kubenswrapper[13205]: I0319 09:34:16.629231 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"]
Mar 19 09:34:16.859285 master-0 kubenswrapper[13205]: I0319 09:34:16.858974 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85810df1-4989-449a-8da0-192c8720d5f4" path="/var/lib/kubelet/pods/85810df1-4989-449a-8da0-192c8720d5f4/volumes"
Mar 19 09:34:17.144071 master-0 kubenswrapper[13205]: I0319 09:34:17.143956 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:34:17.144343 master-0 kubenswrapper[13205]: I0319 09:34:17.144064 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:34:17.232930 master-0 kubenswrapper[13205]: I0319 09:34:17.232770 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"bc2139b7-8af8-4294-aee2-3e7429d2b1fe","Type":"ContainerStarted","Data":"c76824ff9aec5a410a63bff9ce52fea6c987f42ac49fcadb16ca6270f4c9c996"}
Mar 19 09:34:17.232930 master-0 kubenswrapper[13205]: I0319 09:34:17.232833 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"bc2139b7-8af8-4294-aee2-3e7429d2b1fe","Type":"ContainerStarted","Data":"56eab5e1b1b74c5337a671c5cae69f468938e2992f65bdf05f6eb6d21a30bc2c"}
Mar 19 09:34:17.254603 master-0 kubenswrapper[13205]: I0319 09:34:17.254475 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-6-master-0" podStartSLOduration=2.254454593 podStartE2EDuration="2.254454593s" podCreationTimestamp="2026-03-19 09:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:34:17.250441993 +0000 UTC m=+642.582748891" watchObservedRunningTime="2026-03-19 09:34:17.254454593 +0000 UTC m=+642.586761481"
Mar 19 09:34:27.145229 master-0 kubenswrapper[13205]: I0319 09:34:27.145142 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:34:27.146301 master-0 kubenswrapper[13205]: I0319 09:34:27.145245 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:34:33.856692 master-0 kubenswrapper[13205]: I0319 09:34:33.856606 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:34:33.857789 master-0 kubenswrapper[13205]: I0319 09:34:33.856988 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8e27b7d086edf5d2cf47b703574641d8" containerName="wait-for-host-port" containerID="cri-o://0285ae72e28c1ba11c6c5f6c02e25350cf91b897190ff778fd152d801bc77680" gracePeriod=30
Mar 19 09:34:33.858715 master-0 kubenswrapper[13205]: I0319 09:34:33.858219 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:34:33.858829 master-0 kubenswrapper[13205]: E0319 09:34:33.858718 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e27b7d086edf5d2cf47b703574641d8" containerName="wait-for-host-port"
Mar 19 09:34:33.858829 master-0 kubenswrapper[13205]: I0319 09:34:33.858749 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e27b7d086edf5d2cf47b703574641d8" containerName="wait-for-host-port"
Mar 19 09:34:33.858829 master-0 kubenswrapper[13205]: E0319 09:34:33.858778 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85810df1-4989-449a-8da0-192c8720d5f4" containerName="installer"
Mar 19 09:34:33.858829 master-0 kubenswrapper[13205]: I0319 09:34:33.858791 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="85810df1-4989-449a-8da0-192c8720d5f4" containerName="installer"
Mar 19 09:34:33.859164 master-0 kubenswrapper[13205]: I0319 09:34:33.859007 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e27b7d086edf5d2cf47b703574641d8" containerName="wait-for-host-port"
Mar 19 09:34:33.859164 master-0 kubenswrapper[13205]: I0319 09:34:33.859057 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="85810df1-4989-449a-8da0-192c8720d5f4" containerName="installer"
Mar 19 09:34:33.945030 master-0 kubenswrapper[13205]: I0319 09:34:33.944943 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:33.950463 master-0 kubenswrapper[13205]: I0319 09:34:33.950362 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8e27b7d086edf5d2cf47b703574641d8" podUID="11a2f93448b9d54da9854663936e2b73"
Mar 19 09:34:34.041476 master-0 kubenswrapper[13205]: I0319 09:34:34.041374 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"8e27b7d086edf5d2cf47b703574641d8\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") "
Mar 19 09:34:34.042005 master-0 kubenswrapper[13205]: I0319 09:34:34.041509 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"8e27b7d086edf5d2cf47b703574641d8\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") "
Mar 19 09:34:34.042005 master-0 kubenswrapper[13205]: I0319 09:34:34.041508 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8e27b7d086edf5d2cf47b703574641d8" (UID: "8e27b7d086edf5d2cf47b703574641d8"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:34.042005 master-0 kubenswrapper[13205]: I0319 09:34:34.041640 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e27b7d086edf5d2cf47b703574641d8" (UID: "8e27b7d086edf5d2cf47b703574641d8"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:34.042005 master-0 kubenswrapper[13205]: I0319 09:34:34.041967 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:34.042466 master-0 kubenswrapper[13205]: I0319 09:34:34.042136 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:34.042466 master-0 kubenswrapper[13205]: I0319 09:34:34.042206 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:34.042466 master-0 kubenswrapper[13205]: I0319 09:34:34.042272 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:34.143956 master-0 kubenswrapper[13205]: I0319 09:34:34.143872 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:34.144146 master-0 kubenswrapper[13205]: I0319 09:34:34.143993 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:34.144232 master-0 kubenswrapper[13205]: I0319 09:34:34.144135 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:34.144402 master-0 kubenswrapper[13205]: I0319 09:34:34.144345 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:34.390337 master-0 kubenswrapper[13205]: I0319 09:34:34.390086 13205 generic.go:334] "Generic (PLEG): container finished" podID="d1c20f3b-cd10-4eac-88d8-70db61994bc2" containerID="c7843ba93e04bff51321b8359a36b5530e83a8a395053571f10313a5c1e683f4" exitCode=0
Mar 19 09:34:34.390337 master-0 kubenswrapper[13205]: I0319 09:34:34.390144 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"d1c20f3b-cd10-4eac-88d8-70db61994bc2","Type":"ContainerDied","Data":"c7843ba93e04bff51321b8359a36b5530e83a8a395053571f10313a5c1e683f4"}
Mar 19 09:34:34.392342 master-0 kubenswrapper[13205]: I0319 09:34:34.392288 13205 generic.go:334] "Generic (PLEG): container finished" podID="8e27b7d086edf5d2cf47b703574641d8" containerID="0285ae72e28c1ba11c6c5f6c02e25350cf91b897190ff778fd152d801bc77680" exitCode=0
Mar 19 09:34:34.392342 master-0 kubenswrapper[13205]: I0319 09:34:34.392337 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="310b59e18579147352422c9e3d3401a2e5954a4c0b731fa394515b931e80f20a"
Mar 19 09:34:34.392717 master-0 kubenswrapper[13205]: I0319 09:34:34.392442 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:34.426268 master-0 kubenswrapper[13205]: I0319 09:34:34.425992 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8e27b7d086edf5d2cf47b703574641d8" podUID="11a2f93448b9d54da9854663936e2b73"
Mar 19 09:34:34.443435 master-0 kubenswrapper[13205]: I0319 09:34:34.443260 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8e27b7d086edf5d2cf47b703574641d8" podUID="11a2f93448b9d54da9854663936e2b73"
Mar 19 09:34:34.858462 master-0 kubenswrapper[13205]: I0319 09:34:34.858402 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e27b7d086edf5d2cf47b703574641d8" path="/var/lib/kubelet/pods/8e27b7d086edf5d2cf47b703574641d8/volumes"
Mar 19 09:34:35.236922 master-0 kubenswrapper[13205]: E0319 09:34:35.236709 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9
Mar 19 09:34:35.772018 master-0 kubenswrapper[13205]: I0319 09:34:35.771912 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0"
Mar 19 09:34:35.870115 master-0 kubenswrapper[13205]: I0319 09:34:35.869998 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kube-api-access\") pod \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") "
Mar 19 09:34:35.870115 master-0 kubenswrapper[13205]: I0319 09:34:35.870128 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-var-lock\") pod \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") "
Mar 19 09:34:35.870925 master-0 kubenswrapper[13205]: I0319 09:34:35.870217 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kubelet-dir\") pod \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\" (UID: \"d1c20f3b-cd10-4eac-88d8-70db61994bc2\") "
Mar 19 09:34:35.870925 master-0 kubenswrapper[13205]: I0319 09:34:35.870270 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-var-lock" (OuterVolumeSpecName: "var-lock") pod "d1c20f3b-cd10-4eac-88d8-70db61994bc2" (UID: "d1c20f3b-cd10-4eac-88d8-70db61994bc2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:35.870925 master-0 kubenswrapper[13205]: I0319 09:34:35.870334 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d1c20f3b-cd10-4eac-88d8-70db61994bc2" (UID: "d1c20f3b-cd10-4eac-88d8-70db61994bc2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:35.870925 master-0 kubenswrapper[13205]: I0319 09:34:35.870537 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:35.870925 master-0 kubenswrapper[13205]: I0319 09:34:35.870558 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1c20f3b-cd10-4eac-88d8-70db61994bc2-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:35.873466 master-0 kubenswrapper[13205]: I0319 09:34:35.873373 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d1c20f3b-cd10-4eac-88d8-70db61994bc2" (UID: "d1c20f3b-cd10-4eac-88d8-70db61994bc2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:34:35.971509 master-0 kubenswrapper[13205]: I0319 09:34:35.971450 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1c20f3b-cd10-4eac-88d8-70db61994bc2-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:36.417732 master-0 kubenswrapper[13205]: I0319 09:34:36.417629 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"d1c20f3b-cd10-4eac-88d8-70db61994bc2","Type":"ContainerDied","Data":"936eba05bcac3121ff6e515dec952de8a1921b858d6cf95cb1b8dcb3c3886f59"}
Mar 19 09:34:36.417732 master-0 kubenswrapper[13205]: I0319 09:34:36.417686 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="936eba05bcac3121ff6e515dec952de8a1921b858d6cf95cb1b8dcb3c3886f59"
Mar 19 09:34:36.417732 master-0 kubenswrapper[13205]: I0319 09:34:36.417725 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0"
Mar 19 09:34:37.144543 master-0 kubenswrapper[13205]: I0319 09:34:37.144413 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:34:37.144543 master-0 kubenswrapper[13205]: I0319 09:34:37.144518 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:34:41.322467 master-0 kubenswrapper[13205]: I0319 09:34:41.322391 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:34:41.323490 master-0 kubenswrapper[13205]: I0319 09:34:41.322668 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager" containerID="cri-o://234932c6aa4854708a10bfd6ff5c0b2a32a6ce550c7885888734f2d1075fb3a5" gracePeriod=30
Mar 19 09:34:41.323490 master-0 kubenswrapper[13205]: I0319 09:34:41.322807 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://e11c7067a7cc9283dccf50eb10db382afb4e377743f71c297da2c1fc383ce771" gracePeriod=30
Mar 19 09:34:41.323490 master-0 kubenswrapper[13205]: I0319 09:34:41.322850 13205 kuberuntime_container.go:808] "Killing container with a grace period"
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://f46b0b23ccdc4101d15fea4308a57f8af72710fa5156b459dd9c1fc3d0424ef4" gracePeriod=30 Mar 19 09:34:41.323490 master-0 kubenswrapper[13205]: I0319 09:34:41.322882 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="cluster-policy-controller" containerID="cri-o://4b8296c8aab85c007fe985852836d80847020fe70c583a261ac67856bf44c2bf" gracePeriod=30 Mar 19 09:34:41.325815 master-0 kubenswrapper[13205]: I0319 09:34:41.325692 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:34:41.326308 master-0 kubenswrapper[13205]: E0319 09:34:41.326248 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-cert-syncer" Mar 19 09:34:41.326308 master-0 kubenswrapper[13205]: I0319 09:34:41.326294 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-cert-syncer" Mar 19 09:34:41.326452 master-0 kubenswrapper[13205]: E0319 09:34:41.326323 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="cluster-policy-controller" Mar 19 09:34:41.326452 master-0 kubenswrapper[13205]: I0319 09:34:41.326341 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="cluster-policy-controller" Mar 19 09:34:41.326452 master-0 kubenswrapper[13205]: E0319 09:34:41.326371 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c20f3b-cd10-4eac-88d8-70db61994bc2" containerName="installer" Mar 19 
09:34:41.326452 master-0 kubenswrapper[13205]: I0319 09:34:41.326389 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c20f3b-cd10-4eac-88d8-70db61994bc2" containerName="installer" Mar 19 09:34:41.326452 master-0 kubenswrapper[13205]: E0319 09:34:41.326421 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager" Mar 19 09:34:41.326452 master-0 kubenswrapper[13205]: I0319 09:34:41.326438 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager" Mar 19 09:34:41.327277 master-0 kubenswrapper[13205]: E0319 09:34:41.326486 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-recovery-controller" Mar 19 09:34:41.327277 master-0 kubenswrapper[13205]: I0319 09:34:41.326505 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-recovery-controller" Mar 19 09:34:41.327277 master-0 kubenswrapper[13205]: I0319 09:34:41.326843 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager" Mar 19 09:34:41.327277 master-0 kubenswrapper[13205]: I0319 09:34:41.326898 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-cert-syncer" Mar 19 09:34:41.327277 master-0 kubenswrapper[13205]: I0319 09:34:41.326923 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="kube-controller-manager-recovery-controller" Mar 19 09:34:41.327277 master-0 kubenswrapper[13205]: I0319 09:34:41.326959 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c20f3b-cd10-4eac-88d8-70db61994bc2" 
containerName="installer" Mar 19 09:34:41.327277 master-0 kubenswrapper[13205]: I0319 09:34:41.326992 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ef8262f7214653fecba11f5aa7ce13" containerName="cluster-policy-controller" Mar 19 09:34:41.454933 master-0 kubenswrapper[13205]: I0319 09:34:41.454857 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"78163c60e5607dc0ccb2f836459711da\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:41.455124 master-0 kubenswrapper[13205]: I0319 09:34:41.454940 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"78163c60e5607dc0ccb2f836459711da\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:41.474626 master-0 kubenswrapper[13205]: I0319 09:34:41.474388 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_01ef8262f7214653fecba11f5aa7ce13/kube-controller-manager-cert-syncer/0.log" Mar 19 09:34:41.476818 master-0 kubenswrapper[13205]: I0319 09:34:41.475403 13205 generic.go:334] "Generic (PLEG): container finished" podID="01ef8262f7214653fecba11f5aa7ce13" containerID="e11c7067a7cc9283dccf50eb10db382afb4e377743f71c297da2c1fc383ce771" exitCode=0 Mar 19 09:34:41.476818 master-0 kubenswrapper[13205]: I0319 09:34:41.475458 13205 generic.go:334] "Generic (PLEG): container finished" podID="01ef8262f7214653fecba11f5aa7ce13" containerID="f46b0b23ccdc4101d15fea4308a57f8af72710fa5156b459dd9c1fc3d0424ef4" exitCode=2 Mar 19 09:34:41.476818 master-0 kubenswrapper[13205]: 
I0319 09:34:41.475471 13205 generic.go:334] "Generic (PLEG): container finished" podID="01ef8262f7214653fecba11f5aa7ce13" containerID="4b8296c8aab85c007fe985852836d80847020fe70c583a261ac67856bf44c2bf" exitCode=0 Mar 19 09:34:41.476818 master-0 kubenswrapper[13205]: I0319 09:34:41.475480 13205 generic.go:334] "Generic (PLEG): container finished" podID="01ef8262f7214653fecba11f5aa7ce13" containerID="234932c6aa4854708a10bfd6ff5c0b2a32a6ce550c7885888734f2d1075fb3a5" exitCode=0 Mar 19 09:34:41.476818 master-0 kubenswrapper[13205]: I0319 09:34:41.475509 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da454a69acb933dd50995cff47f3884f390cb29b4385920af6216097e647256" Mar 19 09:34:41.556256 master-0 kubenswrapper[13205]: I0319 09:34:41.556196 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"78163c60e5607dc0ccb2f836459711da\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:41.556639 master-0 kubenswrapper[13205]: I0319 09:34:41.556303 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"78163c60e5607dc0ccb2f836459711da\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:41.556639 master-0 kubenswrapper[13205]: I0319 09:34:41.556380 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"78163c60e5607dc0ccb2f836459711da\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:41.556639 master-0 
kubenswrapper[13205]: I0319 09:34:41.556431 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"78163c60e5607dc0ccb2f836459711da\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:41.568136 master-0 kubenswrapper[13205]: I0319 09:34:41.568076 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_01ef8262f7214653fecba11f5aa7ce13/kube-controller-manager-cert-syncer/0.log" Mar 19 09:34:41.569327 master-0 kubenswrapper[13205]: I0319 09:34:41.569291 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:41.573138 master-0 kubenswrapper[13205]: I0319 09:34:41.573039 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="01ef8262f7214653fecba11f5aa7ce13" podUID="78163c60e5607dc0ccb2f836459711da" Mar 19 09:34:41.658193 master-0 kubenswrapper[13205]: I0319 09:34:41.658013 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-cert-dir\") pod \"01ef8262f7214653fecba11f5aa7ce13\" (UID: \"01ef8262f7214653fecba11f5aa7ce13\") " Mar 19 09:34:41.658519 master-0 kubenswrapper[13205]: I0319 09:34:41.658289 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "01ef8262f7214653fecba11f5aa7ce13" (UID: "01ef8262f7214653fecba11f5aa7ce13"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:41.658809 master-0 kubenswrapper[13205]: I0319 09:34:41.658759 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:41.759674 master-0 kubenswrapper[13205]: I0319 09:34:41.759574 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-resource-dir\") pod \"01ef8262f7214653fecba11f5aa7ce13\" (UID: \"01ef8262f7214653fecba11f5aa7ce13\") " Mar 19 09:34:41.759981 master-0 kubenswrapper[13205]: I0319 09:34:41.759800 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "01ef8262f7214653fecba11f5aa7ce13" (UID: "01ef8262f7214653fecba11f5aa7ce13"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:41.760353 master-0 kubenswrapper[13205]: I0319 09:34:41.760308 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/01ef8262f7214653fecba11f5aa7ce13-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:42.487127 master-0 kubenswrapper[13205]: I0319 09:34:42.487024 13205 generic.go:334] "Generic (PLEG): container finished" podID="042705f9-eeff-4d51-808d-6da4be0720d3" containerID="807f743c51c5f0aa8dd41cf4470dc35b029874edbfece3999d20b141b8602497" exitCode=0 Mar 19 09:34:42.488135 master-0 kubenswrapper[13205]: I0319 09:34:42.487171 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:42.488135 master-0 kubenswrapper[13205]: I0319 09:34:42.487151 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"042705f9-eeff-4d51-808d-6da4be0720d3","Type":"ContainerDied","Data":"807f743c51c5f0aa8dd41cf4470dc35b029874edbfece3999d20b141b8602497"} Mar 19 09:34:42.492447 master-0 kubenswrapper[13205]: I0319 09:34:42.492381 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="01ef8262f7214653fecba11f5aa7ce13" podUID="78163c60e5607dc0ccb2f836459711da" Mar 19 09:34:42.533496 master-0 kubenswrapper[13205]: I0319 09:34:42.533414 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="01ef8262f7214653fecba11f5aa7ce13" podUID="78163c60e5607dc0ccb2f836459711da" Mar 19 09:34:42.859154 master-0 kubenswrapper[13205]: I0319 09:34:42.859103 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ef8262f7214653fecba11f5aa7ce13" path="/var/lib/kubelet/pods/01ef8262f7214653fecba11f5aa7ce13/volumes" Mar 19 09:34:43.776175 master-0 kubenswrapper[13205]: I0319 09:34:43.776124 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:43.890325 master-0 kubenswrapper[13205]: I0319 09:34:43.890237 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/042705f9-eeff-4d51-808d-6da4be0720d3-kube-api-access\") pod \"042705f9-eeff-4d51-808d-6da4be0720d3\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " Mar 19 09:34:43.890869 master-0 kubenswrapper[13205]: I0319 09:34:43.890837 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-var-lock\") pod \"042705f9-eeff-4d51-808d-6da4be0720d3\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " Mar 19 09:34:43.891017 master-0 kubenswrapper[13205]: I0319 09:34:43.890975 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-var-lock" (OuterVolumeSpecName: "var-lock") pod "042705f9-eeff-4d51-808d-6da4be0720d3" (UID: "042705f9-eeff-4d51-808d-6da4be0720d3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:43.891087 master-0 kubenswrapper[13205]: I0319 09:34:43.890985 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-kubelet-dir\") pod \"042705f9-eeff-4d51-808d-6da4be0720d3\" (UID: \"042705f9-eeff-4d51-808d-6da4be0720d3\") " Mar 19 09:34:43.891170 master-0 kubenswrapper[13205]: I0319 09:34:43.891149 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "042705f9-eeff-4d51-808d-6da4be0720d3" (UID: "042705f9-eeff-4d51-808d-6da4be0720d3"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:43.894518 master-0 kubenswrapper[13205]: I0319 09:34:43.894482 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:43.894641 master-0 kubenswrapper[13205]: I0319 09:34:43.894517 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/042705f9-eeff-4d51-808d-6da4be0720d3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:43.894742 master-0 kubenswrapper[13205]: I0319 09:34:43.894685 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042705f9-eeff-4d51-808d-6da4be0720d3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "042705f9-eeff-4d51-808d-6da4be0720d3" (UID: "042705f9-eeff-4d51-808d-6da4be0720d3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:34:43.996237 master-0 kubenswrapper[13205]: I0319 09:34:43.996165 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/042705f9-eeff-4d51-808d-6da4be0720d3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:44.506199 master-0 kubenswrapper[13205]: I0319 09:34:44.506140 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"042705f9-eeff-4d51-808d-6da4be0720d3","Type":"ContainerDied","Data":"69f4f783fc8ad486f7e7dc475fd1bf648e6ee7f761f2256c8685ccba9ae33d2c"} Mar 19 09:34:44.506606 master-0 kubenswrapper[13205]: I0319 09:34:44.506577 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f4f783fc8ad486f7e7dc475fd1bf648e6ee7f761f2256c8685ccba9ae33d2c" Mar 19 09:34:44.507034 master-0 kubenswrapper[13205]: I0319 09:34:44.506883 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 09:34:45.849244 master-0 kubenswrapper[13205]: I0319 09:34:45.849129 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:34:45.869671 master-0 kubenswrapper[13205]: I0319 09:34:45.869613 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="2df732e3-b215-4e49-b4ec-c02d14e64425" Mar 19 09:34:45.869671 master-0 kubenswrapper[13205]: I0319 09:34:45.869655 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="2df732e3-b215-4e49-b4ec-c02d14e64425" Mar 19 09:34:45.892474 master-0 kubenswrapper[13205]: I0319 09:34:45.892401 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:34:45.904695 master-0 kubenswrapper[13205]: I0319 09:34:45.904629 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:34:45.908655 master-0 kubenswrapper[13205]: I0319 09:34:45.908612 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:34:45.918196 master-0 kubenswrapper[13205]: I0319 09:34:45.918132 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:34:45.926626 master-0 kubenswrapper[13205]: I0319 09:34:45.926093 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:34:45.934279 master-0 kubenswrapper[13205]: W0319 09:34:45.934225 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a2f93448b9d54da9854663936e2b73.slice/crio-7db13cbdfa075d7e1ee54bcbb0372b833ba22b656ab4ab77d01a35b4188a2e3b WatchSource:0}: Error finding container 7db13cbdfa075d7e1ee54bcbb0372b833ba22b656ab4ab77d01a35b4188a2e3b: Status 404 returned error can't find the container with id 7db13cbdfa075d7e1ee54bcbb0372b833ba22b656ab4ab77d01a35b4188a2e3b Mar 19 09:34:46.524834 master-0 kubenswrapper[13205]: I0319 09:34:46.524783 13205 generic.go:334] "Generic (PLEG): container finished" podID="11a2f93448b9d54da9854663936e2b73" containerID="91fc9820bbc378ac3c5d0235d69916e07ec51078314a806281d445caaaf1f9fe" exitCode=0 Mar 19 09:34:46.525076 master-0 kubenswrapper[13205]: I0319 09:34:46.524877 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"11a2f93448b9d54da9854663936e2b73","Type":"ContainerDied","Data":"91fc9820bbc378ac3c5d0235d69916e07ec51078314a806281d445caaaf1f9fe"} Mar 19 09:34:46.525076 master-0 kubenswrapper[13205]: I0319 09:34:46.524959 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"11a2f93448b9d54da9854663936e2b73","Type":"ContainerStarted","Data":"7db13cbdfa075d7e1ee54bcbb0372b833ba22b656ab4ab77d01a35b4188a2e3b"} Mar 19 
09:34:47.145114 master-0 kubenswrapper[13205]: I0319 09:34:47.145044 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:34:47.145620 master-0 kubenswrapper[13205]: I0319 09:34:47.145190 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:34:47.540941 master-0 kubenswrapper[13205]: I0319 09:34:47.540870 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"11a2f93448b9d54da9854663936e2b73","Type":"ContainerStarted","Data":"d85c161426a1b0175ad90a172cca4e4d8843322ec3d411bcca9fccf3bb07ad91"} Mar 19 09:34:47.540941 master-0 kubenswrapper[13205]: I0319 09:34:47.540941 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"11a2f93448b9d54da9854663936e2b73","Type":"ContainerStarted","Data":"4ffb489960ae764e7d490cc5a515222762442a3abaf010fa816c5f4dbae9dc07"} Mar 19 09:34:47.540941 master-0 kubenswrapper[13205]: I0319 09:34:47.540959 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"11a2f93448b9d54da9854663936e2b73","Type":"ContainerStarted","Data":"372cd682ac3c0ea2bb18f78daead6727ce073fa9a81ef16d8eb3a25f2f9a5913"} Mar 19 09:34:47.544115 master-0 kubenswrapper[13205]: I0319 09:34:47.542155 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:34:47.569805 master-0 
kubenswrapper[13205]: I0319 09:34:47.569657 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.569624777 podStartE2EDuration="2.569624777s" podCreationTimestamp="2026-03-19 09:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:34:47.561741272 +0000 UTC m=+672.894048200" watchObservedRunningTime="2026-03-19 09:34:47.569624777 +0000 UTC m=+672.901931695" Mar 19 09:34:50.711214 master-0 kubenswrapper[13205]: I0319 09:34:50.711155 13205 scope.go:117] "RemoveContainer" containerID="6b51526a63cb4fc4843a03fc75fd50c63454c0795793d3149e658718010b95b1" Mar 19 09:34:50.728487 master-0 kubenswrapper[13205]: I0319 09:34:50.728442 13205 scope.go:117] "RemoveContainer" containerID="793cfb93f2346e0ad23e32cbd1e114aae92c03db2ff0726f899f8a1c39d66416" Mar 19 09:34:54.725253 master-0 kubenswrapper[13205]: I0319 09:34:54.725170 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:34:54.726193 master-0 kubenswrapper[13205]: E0319 09:34:54.725638 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042705f9-eeff-4d51-808d-6da4be0720d3" containerName="installer" Mar 19 09:34:54.726193 master-0 kubenswrapper[13205]: I0319 09:34:54.725661 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="042705f9-eeff-4d51-808d-6da4be0720d3" containerName="installer" Mar 19 09:34:54.726193 master-0 kubenswrapper[13205]: I0319 09:34:54.725921 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="042705f9-eeff-4d51-808d-6da4be0720d3" containerName="installer" Mar 19 09:34:54.726832 master-0 kubenswrapper[13205]: I0319 09:34:54.726788 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.728923 master-0 kubenswrapper[13205]: I0319 09:34:54.728873 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:34:54.729948 master-0 kubenswrapper[13205]: I0319 09:34:54.729903 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" containerID="cri-o://298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52" gracePeriod=15 Mar 19 09:34:54.730333 master-0 kubenswrapper[13205]: I0319 09:34:54.729940 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3" gracePeriod=15 Mar 19 09:34:54.730456 master-0 kubenswrapper[13205]: I0319 09:34:54.730011 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" containerID="cri-o://961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d" gracePeriod=15 Mar 19 09:34:54.730564 master-0 kubenswrapper[13205]: I0319 09:34:54.730011 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894" gracePeriod=15 Mar 19 09:34:54.730654 master-0 kubenswrapper[13205]: I0319 09:34:54.730027 13205 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a" gracePeriod=15 Mar 19 09:34:54.731432 master-0 kubenswrapper[13205]: I0319 09:34:54.731374 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:34:54.732008 master-0 kubenswrapper[13205]: E0319 09:34:54.731955 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 19 09:34:54.732008 master-0 kubenswrapper[13205]: I0319 09:34:54.732002 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 19 09:34:54.732170 master-0 kubenswrapper[13205]: E0319 09:34:54.732033 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:34:54.732170 master-0 kubenswrapper[13205]: I0319 09:34:54.732053 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:34:54.732170 master-0 kubenswrapper[13205]: E0319 09:34:54.732073 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup" Mar 19 09:34:54.732170 master-0 kubenswrapper[13205]: I0319 09:34:54.732091 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup" Mar 19 09:34:54.732170 master-0 kubenswrapper[13205]: E0319 09:34:54.732116 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 19 
09:34:54.732170 master-0 kubenswrapper[13205]: I0319 09:34:54.732133 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 19 09:34:54.732170 master-0 kubenswrapper[13205]: E0319 09:34:54.732165 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 09:34:54.732759 master-0 kubenswrapper[13205]: I0319 09:34:54.732183 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 09:34:54.732759 master-0 kubenswrapper[13205]: E0319 09:34:54.732216 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 19 09:34:54.732759 master-0 kubenswrapper[13205]: I0319 09:34:54.732233 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 19 09:34:54.732759 master-0 kubenswrapper[13205]: I0319 09:34:54.732596 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 19 09:34:54.732759 master-0 kubenswrapper[13205]: I0319 09:34:54.732635 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 19 09:34:54.732759 master-0 kubenswrapper[13205]: I0319 09:34:54.732669 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:34:54.732759 master-0 kubenswrapper[13205]: I0319 09:34:54.732704 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 19 
09:34:54.732759 master-0 kubenswrapper[13205]: I0319 09:34:54.732726 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 09:34:54.786425 master-0 kubenswrapper[13205]: I0319 09:34:54.786354 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:34:54.858120 master-0 kubenswrapper[13205]: I0319 09:34:54.858082 13205 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:54.859018 master-0 kubenswrapper[13205]: I0319 09:34:54.858942 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:54.859267 master-0 kubenswrapper[13205]: I0319 09:34:54.859245 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.859357 master-0 kubenswrapper[13205]: I0319 09:34:54.859342 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.859472 master-0 kubenswrapper[13205]: I0319 09:34:54.859460 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.859595 master-0 kubenswrapper[13205]: I0319 09:34:54.859580 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.859827 master-0 kubenswrapper[13205]: I0319 09:34:54.859808 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.859914 master-0 kubenswrapper[13205]: I0319 09:34:54.859902 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.859981 master-0 kubenswrapper[13205]: I0319 
09:34:54.859969 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.860059 master-0 kubenswrapper[13205]: I0319 09:34:54.860048 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.962190 master-0 kubenswrapper[13205]: I0319 09:34:54.962127 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.962515 master-0 kubenswrapper[13205]: I0319 09:34:54.962492 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.962674 master-0 kubenswrapper[13205]: I0319 09:34:54.962652 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.962866 master-0 kubenswrapper[13205]: I0319 09:34:54.962836 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.963066 master-0 kubenswrapper[13205]: I0319 09:34:54.963049 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.963189 master-0 kubenswrapper[13205]: I0319 09:34:54.963174 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.963284 master-0 kubenswrapper[13205]: I0319 09:34:54.963267 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.963392 master-0 kubenswrapper[13205]: I0319 09:34:54.963375 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.963505 master-0 kubenswrapper[13205]: I0319 09:34:54.962561 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.963658 master-0 kubenswrapper[13205]: I0319 09:34:54.963169 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.963750 master-0 kubenswrapper[13205]: I0319 09:34:54.962798 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.963828 master-0 kubenswrapper[13205]: I0319 09:34:54.963201 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.963912 master-0 kubenswrapper[13205]: I0319 09:34:54.962970 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.964004 master-0 kubenswrapper[13205]: I0319 09:34:54.962328 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:54.964094 master-0 kubenswrapper[13205]: I0319 09:34:54.963435 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:54.964169 master-0 kubenswrapper[13205]: I0319 09:34:54.963442 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:55.074468 master-0 kubenswrapper[13205]: I0319 09:34:55.074407 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:34:55.109504 master-0 kubenswrapper[13205]: W0319 09:34:55.109429 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbfbf2b56df0323ba118d68bfdad8b9.slice/crio-713e07942f5345cd6333781cde1a26ade243fddbdbecf8314885b9d3c1a06314 WatchSource:0}: Error finding container 713e07942f5345cd6333781cde1a26ade243fddbdbecf8314885b9d3c1a06314: Status 404 returned error can't find the container with id 713e07942f5345cd6333781cde1a26ade243fddbdbecf8314885b9d3c1a06314 Mar 19 09:34:55.114242 master-0 kubenswrapper[13205]: E0319 09:34:55.114102 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e346127af4612 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:ebbfbf2b56df0323ba118d68bfdad8b9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:34:55.112824338 +0000 UTC m=+680.445131236,LastTimestamp:2026-03-19 09:34:55.112824338 +0000 UTC m=+680.445131236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:34:55.606984 master-0 kubenswrapper[13205]: I0319 09:34:55.606823 13205 generic.go:334] "Generic (PLEG): 
container finished" podID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" containerID="c76824ff9aec5a410a63bff9ce52fea6c987f42ac49fcadb16ca6270f4c9c996" exitCode=0 Mar 19 09:34:55.606984 master-0 kubenswrapper[13205]: I0319 09:34:55.606918 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"bc2139b7-8af8-4294-aee2-3e7429d2b1fe","Type":"ContainerDied","Data":"c76824ff9aec5a410a63bff9ce52fea6c987f42ac49fcadb16ca6270f4c9c996"} Mar 19 09:34:55.608262 master-0 kubenswrapper[13205]: I0319 09:34:55.608198 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:55.608951 master-0 kubenswrapper[13205]: I0319 09:34:55.608900 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:55.609453 master-0 kubenswrapper[13205]: I0319 09:34:55.609418 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"ebbfbf2b56df0323ba118d68bfdad8b9","Type":"ContainerStarted","Data":"d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c"} Mar 19 09:34:55.609546 master-0 kubenswrapper[13205]: I0319 09:34:55.609463 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
event={"ID":"ebbfbf2b56df0323ba118d68bfdad8b9","Type":"ContainerStarted","Data":"713e07942f5345cd6333781cde1a26ade243fddbdbecf8314885b9d3c1a06314"} Mar 19 09:34:55.611348 master-0 kubenswrapper[13205]: I0319 09:34:55.611210 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:55.612903 master-0 kubenswrapper[13205]: I0319 09:34:55.612851 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:55.613730 master-0 kubenswrapper[13205]: I0319 09:34:55.613686 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 09:34:55.614741 master-0 kubenswrapper[13205]: I0319 09:34:55.614702 13205 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3" exitCode=0 Mar 19 09:34:55.614741 master-0 kubenswrapper[13205]: I0319 09:34:55.614726 13205 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894" exitCode=0 Mar 19 09:34:55.614741 master-0 kubenswrapper[13205]: I0319 09:34:55.614734 13205 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" 
containerID="2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a" exitCode=0 Mar 19 09:34:55.614741 master-0 kubenswrapper[13205]: I0319 09:34:55.614741 13205 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d" exitCode=2 Mar 19 09:34:55.848426 master-0 kubenswrapper[13205]: I0319 09:34:55.848344 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:55.850075 master-0 kubenswrapper[13205]: I0319 09:34:55.850013 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:55.850745 master-0 kubenswrapper[13205]: I0319 09:34:55.850682 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:55.886491 master-0 kubenswrapper[13205]: I0319 09:34:55.886410 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:34:55.886491 master-0 kubenswrapper[13205]: I0319 09:34:55.886448 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:34:55.887289 master-0 kubenswrapper[13205]: E0319 
09:34:55.887229 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:55.887847 master-0 kubenswrapper[13205]: I0319 09:34:55.887803 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:34:56.134192 master-0 kubenswrapper[13205]: E0319 09:34:56.134019 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e346127af4612 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:ebbfbf2b56df0323ba118d68bfdad8b9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:34:55.112824338 +0000 UTC m=+680.445131236,LastTimestamp:2026-03-19 09:34:55.112824338 +0000 UTC m=+680.445131236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:34:56.637296 master-0 kubenswrapper[13205]: I0319 09:34:56.629062 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"78163c60e5607dc0ccb2f836459711da","Type":"ContainerStarted","Data":"909226b2685511d6bab55ace265d3f240cb558432b507591e242a2a343509a3c"} Mar 19 09:34:56.637296 master-0 kubenswrapper[13205]: I0319 09:34:56.629138 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"78163c60e5607dc0ccb2f836459711da","Type":"ContainerStarted","Data":"9891f7b5295dc9e748541b1d5291c66e77a0ec82f3b11cb284bbd29bce4baf72"} Mar 19 09:34:56.637296 master-0 kubenswrapper[13205]: I0319 09:34:56.629159 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"78163c60e5607dc0ccb2f836459711da","Type":"ContainerStarted","Data":"9081822f793a182b6c3527a8e97a4cf0aa7422edceba5bbd9f78d610f192b334"} Mar 19 09:34:57.112458 master-0 kubenswrapper[13205]: I0319 09:34:57.112416 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 09:34:57.113676 master-0 kubenswrapper[13205]: I0319 09:34:57.113632 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:57.114402 master-0 kubenswrapper[13205]: I0319 09:34:57.114369 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:57.144304 master-0 kubenswrapper[13205]: I0319 09:34:57.144268 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:34:57.144484 master-0 kubenswrapper[13205]: I0319 09:34:57.144348 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:34:57.157202 master-0 kubenswrapper[13205]: I0319 09:34:57.157167 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 09:34:57.158241 master-0 kubenswrapper[13205]: I0319 09:34:57.158205 13205 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:34:57.161039 master-0 kubenswrapper[13205]: I0319 09:34:57.160998 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:57.161622 master-0 kubenswrapper[13205]: I0319 09:34:57.161555 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:57.162048 master-0 kubenswrapper[13205]: I0319 09:34:57.162012 13205 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:34:57.225428 master-0 kubenswrapper[13205]: I0319 09:34:57.225271 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kubelet-dir\") pod \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " Mar 19 09:34:57.225428 master-0 kubenswrapper[13205]: I0319 09:34:57.225401 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc2139b7-8af8-4294-aee2-3e7429d2b1fe" (UID: "bc2139b7-8af8-4294-aee2-3e7429d2b1fe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:57.225794 master-0 kubenswrapper[13205]: I0319 09:34:57.225459 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kube-api-access\") pod \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " Mar 19 09:34:57.225794 master-0 kubenswrapper[13205]: I0319 09:34:57.225508 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-var-lock\") pod \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\" (UID: \"bc2139b7-8af8-4294-aee2-3e7429d2b1fe\") " Mar 19 09:34:57.226098 master-0 kubenswrapper[13205]: I0319 09:34:57.226056 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:34:57.226192 master-0 kubenswrapper[13205]: I0319 09:34:57.226148 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-var-lock" (OuterVolumeSpecName: "var-lock") pod "bc2139b7-8af8-4294-aee2-3e7429d2b1fe" (UID: "bc2139b7-8af8-4294-aee2-3e7429d2b1fe"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:34:57.228818 master-0 kubenswrapper[13205]: I0319 09:34:57.228736 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc2139b7-8af8-4294-aee2-3e7429d2b1fe" (UID: "bc2139b7-8af8-4294-aee2-3e7429d2b1fe"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:34:57.326706 master-0 kubenswrapper[13205]: I0319 09:34:57.326599 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 09:34:57.326706 master-0 kubenswrapper[13205]: I0319 09:34:57.326692 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 09:34:57.326706 master-0 kubenswrapper[13205]: I0319 09:34:57.326719 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 09:34:57.327153 master-0 kubenswrapper[13205]: I0319 09:34:57.326911 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:57.327153 master-0 kubenswrapper[13205]: I0319 09:34:57.326911 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:57.327153 master-0 kubenswrapper[13205]: I0319 09:34:57.327005 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:34:57.327153 master-0 kubenswrapper[13205]: I0319 09:34:57.327040 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:57.327153 master-0 kubenswrapper[13205]: I0319 09:34:57.327080 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc2139b7-8af8-4294-aee2-3e7429d2b1fe-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:57.429343 master-0 kubenswrapper[13205]: I0319 09:34:57.429225 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:57.429343 master-0 kubenswrapper[13205]: I0319 09:34:57.429307 13205 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:57.429343 master-0 kubenswrapper[13205]: I0319 09:34:57.429332 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:34:57.651118 master-0 kubenswrapper[13205]: I0319 09:34:57.651041 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"bc2139b7-8af8-4294-aee2-3e7429d2b1fe","Type":"ContainerDied","Data":"56eab5e1b1b74c5337a671c5cae69f468938e2992f65bdf05f6eb6d21a30bc2c"}
Mar 19 09:34:57.651118 master-0 kubenswrapper[13205]: I0319 09:34:57.651095 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56eab5e1b1b74c5337a671c5cae69f468938e2992f65bdf05f6eb6d21a30bc2c"
Mar 19 09:34:57.651118 master-0 kubenswrapper[13205]: I0319 09:34:57.651053 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:34:57.655494 master-0 kubenswrapper[13205]: I0319 09:34:57.655423 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"78163c60e5607dc0ccb2f836459711da","Type":"ContainerStarted","Data":"1fa1b12321a3dee01daab35ab4e7b817db6ce8632ee7561cb941776b17b4a6df"}
Mar 19 09:34:57.655494 master-0 kubenswrapper[13205]: I0319 09:34:57.655492 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"78163c60e5607dc0ccb2f836459711da","Type":"ContainerStarted","Data":"e69e0a00be938327367cfad5fbbfef5b29328de2c5267b1b1fa3b89a40ee396f"}
Mar 19 09:34:57.655979 master-0 kubenswrapper[13205]: I0319 09:34:57.655913 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26"
Mar 19 09:34:57.655979 master-0 kubenswrapper[13205]: I0319 09:34:57.655971 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26"
Mar 19 09:34:57.657544 master-0 kubenswrapper[13205]: E0319 09:34:57.657134 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:34:57.658724 master-0 kubenswrapper[13205]: I0319 09:34:57.658652 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.659448 master-0 kubenswrapper[13205]: I0319 09:34:57.659383 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.660185 master-0 kubenswrapper[13205]: I0319 09:34:57.660124 13205 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.664186 master-0 kubenswrapper[13205]: I0319 09:34:57.663998 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log"
Mar 19 09:34:57.665717 master-0 kubenswrapper[13205]: I0319 09:34:57.665626 13205 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52" exitCode=0
Mar 19 09:34:57.665849 master-0 kubenswrapper[13205]: I0319 09:34:57.665732 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:34:57.665849 master-0 kubenswrapper[13205]: I0319 09:34:57.665761 13205 scope.go:117] "RemoveContainer" containerID="e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3"
Mar 19 09:34:57.669063 master-0 kubenswrapper[13205]: I0319 09:34:57.668980 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.670130 master-0 kubenswrapper[13205]: I0319 09:34:57.669891 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.671074 master-0 kubenswrapper[13205]: I0319 09:34:57.670985 13205 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.696778 master-0 kubenswrapper[13205]: I0319 09:34:57.696706 13205 scope.go:117] "RemoveContainer" containerID="4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894"
Mar 19 09:34:57.698285 master-0 kubenswrapper[13205]: I0319 09:34:57.698201 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.699120 master-0 kubenswrapper[13205]: I0319 09:34:57.699050 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.699849 master-0 kubenswrapper[13205]: I0319 09:34:57.699794 13205 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:34:57.727209 master-0 kubenswrapper[13205]: I0319 09:34:57.727091 13205 scope.go:117] "RemoveContainer" containerID="2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a"
Mar 19 09:34:57.750831 master-0 kubenswrapper[13205]: I0319 09:34:57.750756 13205 scope.go:117] "RemoveContainer" containerID="961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d"
Mar 19 09:34:57.793389 master-0 kubenswrapper[13205]: I0319 09:34:57.793351 13205 scope.go:117] "RemoveContainer" containerID="298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52"
Mar 19 09:34:57.820003 master-0 kubenswrapper[13205]: I0319 09:34:57.819973 13205 scope.go:117] "RemoveContainer" containerID="8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9"
Mar 19 09:34:57.839183 master-0 kubenswrapper[13205]: I0319 09:34:57.839147 13205 scope.go:117] "RemoveContainer" containerID="e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3"
Mar 19 09:34:57.839997 master-0 kubenswrapper[13205]: E0319 09:34:57.839896 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3\": container with ID starting with e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3 not found: ID does not exist" containerID="e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3"
Mar 19 09:34:57.839997 master-0 kubenswrapper[13205]: I0319 09:34:57.839985 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3"} err="failed to get container status \"e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3\": rpc error: code = NotFound desc = could not find container \"e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3\": container with ID starting with e603aaf263c06efbf271924eab1a41acb8214a13a832c2cab32da6d3f150d4a3 not found: ID does not exist"
Mar 19 09:34:57.840273 master-0 kubenswrapper[13205]: I0319 09:34:57.840010 13205 scope.go:117] "RemoveContainer" containerID="4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894"
Mar 19 09:34:57.840490 master-0 kubenswrapper[13205]: E0319 09:34:57.840436 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894\": container with ID starting with 4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894 not found: ID does not exist" containerID="4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894"
Mar 19 09:34:57.840490 master-0 kubenswrapper[13205]: I0319 09:34:57.840462 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894"} err="failed to get container status \"4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894\": rpc error: code = NotFound desc = could not find container \"4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894\": container with ID starting with 4a9d1c85aa5dd20c0a66f3ffda113b9d28087bfc64b37395becd5b86def49894 not found: ID does not exist"
Mar 19 09:34:57.840490 master-0 kubenswrapper[13205]: I0319 09:34:57.840477 13205 scope.go:117] "RemoveContainer" containerID="2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a"
Mar 19 09:34:57.840939 master-0 kubenswrapper[13205]: E0319 09:34:57.840868 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a\": container with ID starting with 2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a not found: ID does not exist" containerID="2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a"
Mar 19 09:34:57.841056 master-0 kubenswrapper[13205]: I0319 09:34:57.840954 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a"} err="failed to get container status \"2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a\": rpc error: code = NotFound desc = could not find container \"2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a\": container with ID starting with 2ec01576dbefe9fc38b2d53b2fd0a23949980fd88014d22013159663680ba28a not found: ID does not exist"
Mar 19 09:34:57.841056 master-0 kubenswrapper[13205]: I0319 09:34:57.841012 13205 scope.go:117] "RemoveContainer" containerID="961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d"
Mar 19 09:34:57.841513 master-0 kubenswrapper[13205]: E0319 09:34:57.841455 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d\": container with ID starting with 961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d not found: ID does not exist" containerID="961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d"
Mar 19 09:34:57.841513 master-0 kubenswrapper[13205]: I0319 09:34:57.841484 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d"} err="failed to get container status \"961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d\": rpc error: code = NotFound desc = could not find container \"961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d\": container with ID starting with 961b1c0b868cdebfe1af0e074eff504c2d3a8b6bd3a3a85e4be5eff862ba9e6d not found: ID does not exist"
Mar 19 09:34:57.841513 master-0 kubenswrapper[13205]: I0319 09:34:57.841573 13205 scope.go:117] "RemoveContainer" containerID="298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52"
Mar 19 09:34:57.842094 master-0 kubenswrapper[13205]: E0319 09:34:57.841991 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52\": container with ID starting with 298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52 not found: ID does not exist" containerID="298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52"
Mar 19 09:34:57.842209 master-0 kubenswrapper[13205]: I0319 09:34:57.842076 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52"} err="failed to get container status \"298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52\": rpc error: code = NotFound desc = could not find container \"298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52\": container with ID starting with 298d17274e038b962e91bd5b84b13151b4f915726949941dec5cbd266a39de52 not found: ID does not exist"
Mar 19 09:34:57.842209 master-0 kubenswrapper[13205]: I0319 09:34:57.842137 13205 scope.go:117] "RemoveContainer" containerID="8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9"
Mar 19 09:34:57.842627 master-0 kubenswrapper[13205]: E0319 09:34:57.842548 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9\": container with ID starting with 8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9 not found: ID does not exist" containerID="8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9"
Mar 19 09:34:57.842627 master-0 kubenswrapper[13205]: I0319 09:34:57.842574 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9"} err="failed to get container status \"8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9\": rpc error: code = NotFound desc = could not find container \"8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9\": container with ID starting with 8567622300f2b47f4209289926c8cdce9e82c9c3e11d93e06f8c06befbb56ca9 not found: ID does not exist"
Mar 19 09:34:58.678801 master-0 kubenswrapper[13205]: I0319 09:34:58.678741 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26"
Mar 19 09:34:58.678801 master-0 kubenswrapper[13205]: I0319 09:34:58.678779 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26"
Mar 19 09:34:58.679621 master-0 kubenswrapper[13205]: E0319 09:34:58.679405 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:34:58.858678 master-0 kubenswrapper[13205]: I0319 09:34:58.858597 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" path="/var/lib/kubelet/pods/b45ea2ef1cf2bc9d1d994d6538ae0a64/volumes"
Mar 19 09:35:03.473122 master-0 kubenswrapper[13205]: E0319 09:35:03.473030 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:03.474191 master-0 kubenswrapper[13205]: E0319 09:35:03.473943 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:03.474990 master-0 kubenswrapper[13205]: E0319 09:35:03.474938 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:03.475834 master-0 kubenswrapper[13205]: E0319 09:35:03.475767 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:03.476804 master-0 kubenswrapper[13205]: E0319 09:35:03.476741 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:03.476804 master-0 kubenswrapper[13205]: I0319 09:35:03.476773 13205 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 19 09:35:03.477951 master-0 kubenswrapper[13205]: E0319 09:35:03.477886 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 09:35:03.679885 master-0 kubenswrapper[13205]: E0319 09:35:03.679765 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 09:35:04.081996 master-0 kubenswrapper[13205]: E0319 09:35:04.081760 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 19 09:35:04.867223 master-0 kubenswrapper[13205]: I0319 09:35:04.867101 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:04.868820 master-0 kubenswrapper[13205]: I0319 09:35:04.868712 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:04.870565 master-0 kubenswrapper[13205]: I0319 09:35:04.870473 13205 status_manager.go:851] "Failed to get status for pod" podUID="78163c60e5607dc0ccb2f836459711da" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:04.884064 master-0 kubenswrapper[13205]: E0319 09:35:04.883988 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 09:35:05.888860 master-0 kubenswrapper[13205]: I0319 09:35:05.888754 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:35:05.888860 master-0 kubenswrapper[13205]: I0319 09:35:05.888845 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:35:05.888860 master-0 kubenswrapper[13205]: I0319 09:35:05.888876 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:35:05.890118 master-0 kubenswrapper[13205]: I0319 09:35:05.888898 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:35:05.890118 master-0 kubenswrapper[13205]: I0319 09:35:05.889024 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 09:35:05.890118 master-0 kubenswrapper[13205]: I0319 09:35:05.889101 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 09:35:05.890421 master-0 kubenswrapper[13205]: I0319 09:35:05.890248 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26"
Mar 19 09:35:05.890421 master-0 kubenswrapper[13205]: I0319 09:35:05.890284 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26"
Mar 19 09:35:05.891630 master-0 kubenswrapper[13205]: E0319 09:35:05.891521 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:35:06.135430 master-0 kubenswrapper[13205]: E0319 09:35:06.135223 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e346127af4612 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:ebbfbf2b56df0323ba118d68bfdad8b9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:34:55.112824338 +0000 UTC m=+680.445131236,LastTimestamp:2026-03-19 09:34:55.112824338 +0000 UTC m=+680.445131236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:35:06.486365 master-0 kubenswrapper[13205]: E0319 09:35:06.486280 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 19 09:35:07.144314 master-0 kubenswrapper[13205]: I0319 09:35:07.144210 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:35:07.145204 master-0 kubenswrapper[13205]: I0319 09:35:07.144316 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:35:08.889390 master-0 kubenswrapper[13205]: I0319 09:35:08.889276 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:35:08.889390 master-0 kubenswrapper[13205]: I0319 09:35:08.889393 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="78163c60e5607dc0ccb2f836459711da" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:35:09.688393 master-0 kubenswrapper[13205]: E0319 09:35:09.688282 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 19 09:35:09.848510 master-0 kubenswrapper[13205]: I0319 09:35:09.848442 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:09.850122 master-0 kubenswrapper[13205]: I0319 09:35:09.850036 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:09.851210 master-0 kubenswrapper[13205]: I0319 09:35:09.851152 13205 status_manager.go:851] "Failed to get status for pod" podUID="78163c60e5607dc0ccb2f836459711da" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:09.852042 master-0 kubenswrapper[13205]: I0319 09:35:09.851989 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:09.867692 master-0 kubenswrapper[13205]: I0319 09:35:09.867631 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8"
Mar 19 09:35:09.867692 master-0 kubenswrapper[13205]: I0319 09:35:09.867685 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8"
Mar 19 09:35:09.868722 master-0 kubenswrapper[13205]: E0319 09:35:09.868662 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:09.869383 master-0 kubenswrapper[13205]: I0319 09:35:09.869344 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:09.912686 master-0 kubenswrapper[13205]: W0319 09:35:09.912598 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274c4bebf95a655851b2cf276fe43ef7.slice/crio-130e993a6929b0891605ffc6e4867356165cecb5b8f47229f0e5b41542b3cc3e WatchSource:0}: Error finding container 130e993a6929b0891605ffc6e4867356165cecb5b8f47229f0e5b41542b3cc3e: Status 404 returned error can't find the container with id 130e993a6929b0891605ffc6e4867356165cecb5b8f47229f0e5b41542b3cc3e
Mar 19 09:35:10.775668 master-0 kubenswrapper[13205]: I0319 09:35:10.775602 13205 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7" exitCode=0
Mar 19 09:35:10.775668 master-0 kubenswrapper[13205]: I0319 09:35:10.775672 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerDied","Data":"400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7"}
Mar 19 09:35:10.776006 master-0 kubenswrapper[13205]: I0319 09:35:10.775714 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"130e993a6929b0891605ffc6e4867356165cecb5b8f47229f0e5b41542b3cc3e"}
Mar 19 09:35:10.776056 master-0 kubenswrapper[13205]: I0319 09:35:10.776011 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8"
Mar 19 09:35:10.776056 master-0 kubenswrapper[13205]: I0319 09:35:10.776025 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8"
Mar 19 09:35:10.777353 master-0 kubenswrapper[13205]: E0319 09:35:10.776928 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:10.777353 master-0 kubenswrapper[13205]: I0319 09:35:10.776976 13205 status_manager.go:851] "Failed to get status for pod" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:10.777761 master-0 kubenswrapper[13205]: I0319 09:35:10.777575 13205 status_manager.go:851] "Failed to get status for pod" podUID="78163c60e5607dc0ccb2f836459711da" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:10.778432 master-0 kubenswrapper[13205]: I0319 09:35:10.778402 13205 status_manager.go:851] "Failed to get status for pod" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:35:11.792019 master-0 kubenswrapper[13205]: I0319 09:35:11.791971 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f"}
Mar 19 09:35:11.792019 master-0 kubenswrapper[13205]: I0319 09:35:11.792021 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880"}
Mar 19 09:35:11.792541 master-0 kubenswrapper[13205]: I0319 09:35:11.792036 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb"}
Mar 19 09:35:11.792541 master-0 kubenswrapper[13205]: I0319 09:35:11.792049 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118"}
Mar 19 09:35:12.806813 master-0 kubenswrapper[13205]: I0319 09:35:12.806755 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97"}
Mar 19 09:35:12.807399 master-0 kubenswrapper[13205]: I0319 09:35:12.806940 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:12.807399 master-0 kubenswrapper[13205]: I0319 09:35:12.807051 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8"
Mar 19 09:35:12.807399 master-0 kubenswrapper[13205]: I0319 09:35:12.807079 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8"
Mar 19 09:35:14.870067 master-0 kubenswrapper[13205]: I0319 09:35:14.869986 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:14.870730 master-0 kubenswrapper[13205]: I0319 09:35:14.870080 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:14.882101 master-0 kubenswrapper[13205]: I0319 09:35:14.882044 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:35:15.889264 master-0 kubenswrapper[13205]: I0319 09:35:15.889198 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 09:35:15.889264 master-0 kubenswrapper[13205]: I0319 09:35:15.889263 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 09:35:17.144264 master-0 kubenswrapper[13205]: I0319
09:35:17.144199 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:35:17.144264 master-0 kubenswrapper[13205]: I0319 09:35:17.144257 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:35:17.664110 master-0 kubenswrapper[13205]: I0319 09:35:17.664027 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:35:17.666508 master-0 kubenswrapper[13205]: I0319 09:35:17.664610 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:35:17.666508 master-0 kubenswrapper[13205]: I0319 09:35:17.664639 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:35:17.669735 master-0 kubenswrapper[13205]: I0319 09:35:17.669682 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:35:17.686000 master-0 kubenswrapper[13205]: I0319 09:35:17.685946 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:35:17.821841 master-0 kubenswrapper[13205]: I0319 09:35:17.821788 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:35:17.841998 master-0 kubenswrapper[13205]: I0319 09:35:17.841943 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="274c4bebf95a655851b2cf276fe43ef7" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416" Mar 19 09:35:17.851781 master-0 kubenswrapper[13205]: I0319 09:35:17.851728 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:35:17.851781 master-0 kubenswrapper[13205]: I0319 09:35:17.851757 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:35:17.852033 master-0 kubenswrapper[13205]: I0319 09:35:17.851818 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8" Mar 19 09:35:17.852033 master-0 kubenswrapper[13205]: I0319 09:35:17.851858 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="13e99065-2a42-467c-8585-97de1dffebb8" Mar 19 09:35:17.862814 master-0 kubenswrapper[13205]: I0319 09:35:17.862752 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="78163c60e5607dc0ccb2f836459711da" podUID="7ab8a1cf-a389-4cc9-aded-1c30fc67bebb" Mar 19 09:35:17.872405 master-0 kubenswrapper[13205]: I0319 09:35:17.872349 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="274c4bebf95a655851b2cf276fe43ef7" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416" Mar 19 09:35:26.982331 master-0 
kubenswrapper[13205]: I0319 09:35:26.982246 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 09:35:27.144694 master-0 kubenswrapper[13205]: I0319 09:35:27.144609 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:35:27.144951 master-0 kubenswrapper[13205]: I0319 09:35:27.144709 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:35:27.455374 master-0 kubenswrapper[13205]: I0319 09:35:27.455252 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-qn2w4" Mar 19 09:35:27.737010 master-0 kubenswrapper[13205]: I0319 09:35:27.736856 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:35:27.845729 master-0 kubenswrapper[13205]: I0319 09:35:27.845633 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 09:35:27.926920 master-0 kubenswrapper[13205]: I0319 09:35:27.926846 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-pvd7b" Mar 19 09:35:28.136154 master-0 kubenswrapper[13205]: I0319 09:35:28.135825 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 09:35:28.526415 master-0 kubenswrapper[13205]: I0319 09:35:28.526351 13205 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 09:35:28.591076 master-0 kubenswrapper[13205]: I0319 09:35:28.591003 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:35:28.711088 master-0 kubenswrapper[13205]: I0319 09:35:28.710924 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 09:35:28.728680 master-0 kubenswrapper[13205]: I0319 09:35:28.728583 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 09:35:28.871982 master-0 kubenswrapper[13205]: I0319 09:35:28.871896 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-khk7h" Mar 19 09:35:28.917147 master-0 kubenswrapper[13205]: I0319 09:35:28.917056 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:35:29.309269 master-0 kubenswrapper[13205]: I0319 09:35:29.309171 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:35:29.444264 master-0 kubenswrapper[13205]: I0319 09:35:29.444195 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 09:35:29.562146 master-0 kubenswrapper[13205]: I0319 09:35:29.561967 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:35:29.641025 master-0 kubenswrapper[13205]: I0319 09:35:29.640946 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 09:35:29.650049 master-0 kubenswrapper[13205]: I0319 09:35:29.649998 13205 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 09:35:29.660682 master-0 kubenswrapper[13205]: I0319 09:35:29.660641 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 09:35:29.852243 master-0 kubenswrapper[13205]: I0319 09:35:29.852093 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 09:35:29.922198 master-0 kubenswrapper[13205]: I0319 09:35:29.922144 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:35:29.930668 master-0 kubenswrapper[13205]: I0319 09:35:29.930618 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:35:29.950878 master-0 kubenswrapper[13205]: I0319 09:35:29.950729 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:35:29.962405 master-0 kubenswrapper[13205]: I0319 09:35:29.962338 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 09:35:30.205502 master-0 kubenswrapper[13205]: I0319 09:35:30.205311 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:35:30.309492 master-0 kubenswrapper[13205]: I0319 09:35:30.309394 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-lsnll" Mar 19 09:35:30.328820 master-0 kubenswrapper[13205]: I0319 09:35:30.328747 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-czcgc" Mar 19 09:35:30.431900 master-0 kubenswrapper[13205]: I0319 09:35:30.431858 13205 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:35:30.455167 master-0 kubenswrapper[13205]: I0319 09:35:30.455101 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 19 09:35:30.492890 master-0 kubenswrapper[13205]: I0319 09:35:30.492764 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:35:30.507682 master-0 kubenswrapper[13205]: I0319 09:35:30.507636 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:35:30.509375 master-0 kubenswrapper[13205]: I0319 09:35:30.509324 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:35:30.596397 master-0 kubenswrapper[13205]: I0319 09:35:30.595517 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 09:35:30.607403 master-0 kubenswrapper[13205]: I0319 09:35:30.607337 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-sjz5s" Mar 19 09:35:30.609617 master-0 kubenswrapper[13205]: I0319 09:35:30.609580 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 09:35:30.671451 master-0 kubenswrapper[13205]: I0319 09:35:30.671422 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:35:30.687240 master-0 kubenswrapper[13205]: I0319 09:35:30.687188 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 09:35:30.718796 master-0 kubenswrapper[13205]: I0319 09:35:30.718378 13205 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 09:35:30.737738 master-0 kubenswrapper[13205]: I0319 09:35:30.735268 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-88lgr" Mar 19 09:35:30.792066 master-0 kubenswrapper[13205]: I0319 09:35:30.792006 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:35:30.918359 master-0 kubenswrapper[13205]: I0319 09:35:30.918283 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:35:31.014443 master-0 kubenswrapper[13205]: I0319 09:35:31.014375 13205 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:35:31.019635 master-0 kubenswrapper[13205]: I0319 09:35:31.019588 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 09:35:31.029751 master-0 kubenswrapper[13205]: I0319 09:35:31.029696 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:35:31.083220 master-0 kubenswrapper[13205]: I0319 09:35:31.083100 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 09:35:31.150899 master-0 kubenswrapper[13205]: I0319 09:35:31.150815 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 09:35:31.166751 master-0 kubenswrapper[13205]: I0319 09:35:31.166682 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 09:35:31.174229 master-0 kubenswrapper[13205]: I0319 09:35:31.174176 13205 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"metrics-server-tls" Mar 19 09:35:31.277570 master-0 kubenswrapper[13205]: I0319 09:35:31.276450 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 09:35:31.311277 master-0 kubenswrapper[13205]: I0319 09:35:31.311204 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 09:35:31.437052 master-0 kubenswrapper[13205]: I0319 09:35:31.436936 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-p8jxl" Mar 19 09:35:31.456179 master-0 kubenswrapper[13205]: I0319 09:35:31.456124 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:35:31.563370 master-0 kubenswrapper[13205]: I0319 09:35:31.563304 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:35:31.607616 master-0 kubenswrapper[13205]: I0319 09:35:31.607540 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:35:31.631516 master-0 kubenswrapper[13205]: I0319 09:35:31.631450 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:35:31.632300 master-0 kubenswrapper[13205]: I0319 09:35:31.632238 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:35:31.696989 master-0 kubenswrapper[13205]: I0319 09:35:31.696817 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 09:35:31.756356 master-0 kubenswrapper[13205]: I0319 09:35:31.756296 13205 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-b9dtc" Mar 19 09:35:31.800853 master-0 kubenswrapper[13205]: I0319 09:35:31.800788 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 09:35:31.811146 master-0 kubenswrapper[13205]: I0319 09:35:31.811086 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 09:35:31.814609 master-0 kubenswrapper[13205]: I0319 09:35:31.814564 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-jvtc2" Mar 19 09:35:31.823548 master-0 kubenswrapper[13205]: I0319 09:35:31.823507 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 09:35:31.842817 master-0 kubenswrapper[13205]: I0319 09:35:31.842751 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:35:31.960860 master-0 kubenswrapper[13205]: I0319 09:35:31.960622 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:35:31.999465 master-0 kubenswrapper[13205]: I0319 09:35:31.999370 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 09:35:32.027018 master-0 kubenswrapper[13205]: I0319 09:35:32.026925 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 09:35:32.147853 master-0 kubenswrapper[13205]: I0319 09:35:32.147803 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-7dv6h" Mar 19 09:35:32.241698 master-0 kubenswrapper[13205]: I0319 
09:35:32.241577 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:35:32.313296 master-0 kubenswrapper[13205]: I0319 09:35:32.313208 13205 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:35:32.347217 master-0 kubenswrapper[13205]: I0319 09:35:32.347153 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 09:35:32.362763 master-0 kubenswrapper[13205]: I0319 09:35:32.362691 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-avpd2mlhiq4t" Mar 19 09:35:32.599278 master-0 kubenswrapper[13205]: I0319 09:35:32.599154 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:35:32.600930 master-0 kubenswrapper[13205]: I0319 09:35:32.600865 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:35:32.610723 master-0 kubenswrapper[13205]: I0319 09:35:32.610632 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:35:32.636233 master-0 kubenswrapper[13205]: I0319 09:35:32.636142 13205 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 09:35:32.638321 master-0 kubenswrapper[13205]: I0319 09:35:32.638146 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=38.638122377 podStartE2EDuration="38.638122377s" podCreationTimestamp="2026-03-19 09:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:35:17.860361159 
+0000 UTC m=+703.192668047" watchObservedRunningTime="2026-03-19 09:35:32.638122377 +0000 UTC m=+717.970429295" Mar 19 09:35:32.645758 master-0 kubenswrapper[13205]: I0319 09:35:32.645697 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:35:32.645956 master-0 kubenswrapper[13205]: I0319 09:35:32.645779 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:35:32.646367 master-0 kubenswrapper[13205]: I0319 09:35:32.646293 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:35:32.646367 master-0 kubenswrapper[13205]: I0319 09:35:32.646349 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="3eb0da85-1f1f-4254-8391-afd92f67bb26" Mar 19 09:35:32.653202 master-0 kubenswrapper[13205]: I0319 09:35:32.651425 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:35:32.655215 master-0 kubenswrapper[13205]: I0319 09:35:32.655164 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:35:32.696692 master-0 kubenswrapper[13205]: I0319 09:35:32.696612 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 09:35:32.697697 master-0 kubenswrapper[13205]: I0319 09:35:32.697041 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=15.697018115 podStartE2EDuration="15.697018115s" 
podCreationTimestamp="2026-03-19 09:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:35:32.689054739 +0000 UTC m=+718.021361687" watchObservedRunningTime="2026-03-19 09:35:32.697018115 +0000 UTC m=+718.029325043" Mar 19 09:35:32.740063 master-0 kubenswrapper[13205]: I0319 09:35:32.740007 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 09:35:32.756178 master-0 kubenswrapper[13205]: I0319 09:35:32.756142 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:35:32.776723 master-0 kubenswrapper[13205]: I0319 09:35:32.776041 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xh8t6" Mar 19 09:35:32.780145 master-0 kubenswrapper[13205]: I0319 09:35:32.780082 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=15.780059887 podStartE2EDuration="15.780059887s" podCreationTimestamp="2026-03-19 09:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:35:32.774436319 +0000 UTC m=+718.106743207" watchObservedRunningTime="2026-03-19 09:35:32.780059887 +0000 UTC m=+718.112366775" Mar 19 09:35:32.843631 master-0 kubenswrapper[13205]: I0319 09:35:32.843558 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 09:35:33.016675 master-0 kubenswrapper[13205]: I0319 09:35:33.016604 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:35:33.044677 master-0 kubenswrapper[13205]: 
I0319 09:35:33.044606 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:35:33.071736 master-0 kubenswrapper[13205]: I0319 09:35:33.071681 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:35:33.076009 master-0 kubenswrapper[13205]: I0319 09:35:33.075970 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:35:33.193485 master-0 kubenswrapper[13205]: I0319 09:35:33.193412 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 09:35:33.308504 master-0 kubenswrapper[13205]: I0319 09:35:33.308385 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-gpc6r" Mar 19 09:35:33.322979 master-0 kubenswrapper[13205]: I0319 09:35:33.322933 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-4wm5n" Mar 19 09:35:33.337540 master-0 kubenswrapper[13205]: I0319 09:35:33.337490 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 09:35:33.361231 master-0 kubenswrapper[13205]: I0319 09:35:33.361148 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:35:33.400182 master-0 kubenswrapper[13205]: I0319 09:35:33.400089 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:35:33.400928 master-0 kubenswrapper[13205]: I0319 09:35:33.400866 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 09:35:33.403576 master-0 kubenswrapper[13205]: 
I0319 09:35:33.403512 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 09:35:33.422929 master-0 kubenswrapper[13205]: I0319 09:35:33.422835 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 09:35:33.453443 master-0 kubenswrapper[13205]: I0319 09:35:33.453353 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 09:35:33.559312 master-0 kubenswrapper[13205]: I0319 09:35:33.559124 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 09:35:33.653302 master-0 kubenswrapper[13205]: I0319 09:35:33.653231 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-fdmh2"
Mar 19 09:35:33.654565 master-0 kubenswrapper[13205]: I0319 09:35:33.654460 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 09:35:33.690506 master-0 kubenswrapper[13205]: I0319 09:35:33.690422 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 09:35:33.710426 master-0 kubenswrapper[13205]: I0319 09:35:33.710351 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 09:35:33.711180 master-0 kubenswrapper[13205]: I0319 09:35:33.711126 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 09:35:33.777632 master-0 kubenswrapper[13205]: I0319 09:35:33.777555 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 09:35:33.799616 master-0 kubenswrapper[13205]: I0319 09:35:33.799563 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 09:35:33.839956 master-0 kubenswrapper[13205]: I0319 09:35:33.839819 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 09:35:33.935128 master-0 kubenswrapper[13205]: I0319 09:35:33.935030 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 09:35:33.974994 master-0 kubenswrapper[13205]: I0319 09:35:33.974882 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 09:35:34.078605 master-0 kubenswrapper[13205]: I0319 09:35:34.078550 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-xnqzt"
Mar 19 09:35:34.079267 master-0 kubenswrapper[13205]: I0319 09:35:34.079215 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-s8fs4"
Mar 19 09:35:34.085742 master-0 kubenswrapper[13205]: I0319 09:35:34.085674 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 09:35:34.155392 master-0 kubenswrapper[13205]: I0319 09:35:34.155190 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 09:35:34.209782 master-0 kubenswrapper[13205]: I0319 09:35:34.209686 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 09:35:34.422929 master-0 kubenswrapper[13205]: I0319 09:35:34.422785 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 09:35:34.513972 master-0 kubenswrapper[13205]: I0319 09:35:34.513859 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 09:35:34.560505 master-0 kubenswrapper[13205]: I0319 09:35:34.560417 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:35:34.573311 master-0 kubenswrapper[13205]: I0319 09:35:34.573239 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 09:35:34.775890 master-0 kubenswrapper[13205]: I0319 09:35:34.775804 13205 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 09:35:34.804857 master-0 kubenswrapper[13205]: I0319 09:35:34.804772 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 09:35:34.828837 master-0 kubenswrapper[13205]: I0319 09:35:34.828753 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 19 09:35:34.844112 master-0 kubenswrapper[13205]: I0319 09:35:34.843993 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 09:35:34.928965 master-0 kubenswrapper[13205]: I0319 09:35:34.928880 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 09:35:34.955612 master-0 kubenswrapper[13205]: I0319 09:35:34.955545 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 09:35:34.988572 master-0 kubenswrapper[13205]: I0319 09:35:34.988326 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 09:35:35.174082 master-0 kubenswrapper[13205]: I0319 09:35:35.173926 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 09:35:35.195120 master-0 kubenswrapper[13205]: I0319 09:35:35.195039 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 09:35:35.208814 master-0 kubenswrapper[13205]: E0319 09:35:35.208737 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9
Mar 19 09:35:35.215485 master-0 kubenswrapper[13205]: I0319 09:35:35.215418 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-rcdtx"
Mar 19 09:35:35.313549 master-0 kubenswrapper[13205]: I0319 09:35:35.313443 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 09:35:35.339732 master-0 kubenswrapper[13205]: I0319 09:35:35.339678 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 19 09:35:35.427402 master-0 kubenswrapper[13205]: I0319 09:35:35.427257 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 09:35:35.433248 master-0 kubenswrapper[13205]: I0319 09:35:35.433178 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 09:35:35.448609 master-0 kubenswrapper[13205]: I0319 09:35:35.448485 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 09:35:35.470128 master-0 kubenswrapper[13205]: I0319 09:35:35.470037 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 09:35:35.633830 master-0 kubenswrapper[13205]: I0319 09:35:35.633782 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:35:35.635698 master-0 kubenswrapper[13205]: I0319 09:35:35.635673 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-lpljf"
Mar 19 09:35:35.717292 master-0 kubenswrapper[13205]: I0319 09:35:35.717160 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 09:35:35.753433 master-0 kubenswrapper[13205]: I0319 09:35:35.753345 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:35:35.877469 master-0 kubenswrapper[13205]: I0319 09:35:35.877437 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 09:35:35.896758 master-0 kubenswrapper[13205]: I0319 09:35:35.896720 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:35:35.898475 master-0 kubenswrapper[13205]: I0319 09:35:35.898446 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-9tw96"
Mar 19 09:35:35.900385 master-0 kubenswrapper[13205]: I0319 09:35:35.900370 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:35:35.914847 master-0 kubenswrapper[13205]: I0319 09:35:35.914820 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 09:35:35.915243 master-0 kubenswrapper[13205]: I0319 09:35:35.915229 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:35.962171 master-0 kubenswrapper[13205]: I0319 09:35:35.962103 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 19 09:35:36.030400 master-0 kubenswrapper[13205]: I0319 09:35:36.028334 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:35:36.043551 master-0 kubenswrapper[13205]: I0319 09:35:36.043493 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-s9ktx"
Mar 19 09:35:36.051887 master-0 kubenswrapper[13205]: I0319 09:35:36.051843 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 19 09:35:36.083806 master-0 kubenswrapper[13205]: I0319 09:35:36.083747 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:35:36.136318 master-0 kubenswrapper[13205]: I0319 09:35:36.136256 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 09:35:36.177320 master-0 kubenswrapper[13205]: I0319 09:35:36.177264 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-mvv8v"
Mar 19 09:35:36.242262 master-0 kubenswrapper[13205]: I0319 09:35:36.242184 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 09:35:36.357025 master-0 kubenswrapper[13205]: I0319 09:35:36.356905 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 09:35:36.473173 master-0 kubenswrapper[13205]: I0319 09:35:36.473066 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 09:35:36.522581 master-0 kubenswrapper[13205]: I0319 09:35:36.522492 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:35:36.525136 master-0 kubenswrapper[13205]: I0319 09:35:36.525087 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2rqbc"
Mar 19 09:35:36.579986 master-0 kubenswrapper[13205]: I0319 09:35:36.579927 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 09:35:36.616648 master-0 kubenswrapper[13205]: I0319 09:35:36.616423 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 09:35:36.682213 master-0 kubenswrapper[13205]: I0319 09:35:36.682145 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6j7vofh1gbciq"
Mar 19 09:35:36.684566 master-0 kubenswrapper[13205]: I0319 09:35:36.684545 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 09:35:36.746211 master-0 kubenswrapper[13205]: I0319 09:35:36.746148 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 09:35:36.749022 master-0 kubenswrapper[13205]: I0319 09:35:36.748953 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-2zxtq"
Mar 19 09:35:36.756314 master-0 kubenswrapper[13205]: I0319 09:35:36.756276 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:35:36.791963 master-0 kubenswrapper[13205]: I0319 09:35:36.791908 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 19 09:35:36.802090 master-0 kubenswrapper[13205]: I0319 09:35:36.802027 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 09:35:36.803245 master-0 kubenswrapper[13205]: I0319 09:35:36.803199 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 09:35:36.893916 master-0 kubenswrapper[13205]: I0319 09:35:36.893801 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 09:35:36.966827 master-0 kubenswrapper[13205]: I0319 09:35:36.966777 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 09:35:36.975625 master-0 kubenswrapper[13205]: I0319 09:35:36.975513 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 09:35:36.994316 master-0 kubenswrapper[13205]: I0319 09:35:36.994275 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qttf4"
Mar 19 09:35:37.103961 master-0 kubenswrapper[13205]: I0319 09:35:37.103891 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 09:35:37.144157 master-0 kubenswrapper[13205]: I0319 09:35:37.144025 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:35:37.144157 master-0 kubenswrapper[13205]: I0319 09:35:37.144098 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:35:37.161660 master-0 kubenswrapper[13205]: I0319 09:35:37.161599 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-45zpl"
Mar 19 09:35:37.161811 master-0 kubenswrapper[13205]: I0319 09:35:37.161667 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:35:37.192260 master-0 kubenswrapper[13205]: I0319 09:35:37.192198 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 09:35:37.236659 master-0 kubenswrapper[13205]: I0319 09:35:37.236564 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 09:35:37.397921 master-0 kubenswrapper[13205]: I0319 09:35:37.397742 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-zbzf5"
Mar 19 09:35:37.411318 master-0 kubenswrapper[13205]: I0319 09:35:37.411254 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 09:35:37.446497 master-0 kubenswrapper[13205]: I0319 09:35:37.446436 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 09:35:37.499035 master-0 kubenswrapper[13205]: I0319 09:35:37.498978 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 09:35:37.552010 master-0 kubenswrapper[13205]: I0319 09:35:37.551937 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 09:35:37.564039 master-0 kubenswrapper[13205]: I0319 09:35:37.563968 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 09:35:37.574679 master-0 kubenswrapper[13205]: I0319 09:35:37.574620 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 09:35:37.620516 master-0 kubenswrapper[13205]: I0319 09:35:37.620437 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 09:35:37.664119 master-0 kubenswrapper[13205]: I0319 09:35:37.663991 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 09:35:37.809559 master-0 kubenswrapper[13205]: I0319 09:35:37.809479 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 09:35:37.900370 master-0 kubenswrapper[13205]: I0319 09:35:37.900272 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 09:35:37.900893 master-0 kubenswrapper[13205]: I0319 09:35:37.900415 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 09:35:37.924773 master-0 kubenswrapper[13205]: I0319 09:35:37.924650 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 09:35:37.931065 master-0 kubenswrapper[13205]: I0319 09:35:37.931012 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 09:35:37.995149 master-0 kubenswrapper[13205]: I0319 09:35:37.995056 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-bxhvs"
Mar 19 09:35:38.079133 master-0 kubenswrapper[13205]: I0319 09:35:38.074033 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 09:35:38.131130 master-0 kubenswrapper[13205]: I0319 09:35:38.131071 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 09:35:38.152919 master-0 kubenswrapper[13205]: I0319 09:35:38.152870 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:35:38.156260 master-0 kubenswrapper[13205]: I0319 09:35:38.156226 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:35:38.198368 master-0 kubenswrapper[13205]: I0319 09:35:38.198245 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 09:35:38.250197 master-0 kubenswrapper[13205]: I0319 09:35:38.250124 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 09:35:38.287511 master-0 kubenswrapper[13205]: I0319 09:35:38.287464 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 09:35:38.320723 master-0 kubenswrapper[13205]: I0319 09:35:38.320664 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gfzzh"
Mar 19 09:35:38.535880 master-0 kubenswrapper[13205]: I0319 09:35:38.535839 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 09:35:38.787254 master-0 kubenswrapper[13205]: I0319 09:35:38.787119 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 09:35:38.818799 master-0 kubenswrapper[13205]: I0319 09:35:38.818726 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 09:35:38.836872 master-0 kubenswrapper[13205]: I0319 09:35:38.836813 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 09:35:38.862215 master-0 kubenswrapper[13205]: I0319 09:35:38.862159 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 09:35:38.866923 master-0 kubenswrapper[13205]: I0319 09:35:38.866863 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 09:35:38.878723 master-0 kubenswrapper[13205]: I0319 09:35:38.878680 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 09:35:38.898950 master-0 kubenswrapper[13205]: I0319 09:35:38.898900 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 09:35:38.989238 master-0 kubenswrapper[13205]: I0319 09:35:38.989164 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 09:35:39.128007 master-0 kubenswrapper[13205]: I0319 09:35:39.127881 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 09:35:39.201895 master-0 kubenswrapper[13205]: I0319 09:35:39.201831 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 09:35:39.222481 master-0 kubenswrapper[13205]: I0319 09:35:39.222397 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 09:35:39.293660 master-0 kubenswrapper[13205]: I0319 09:35:39.293593 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:35:39.565458 master-0 kubenswrapper[13205]: I0319 09:35:39.565365 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 09:35:39.620738 master-0 kubenswrapper[13205]: I0319 09:35:39.620685 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 09:35:39.681463 master-0 kubenswrapper[13205]: I0319 09:35:39.681404 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 09:35:39.708934 master-0 kubenswrapper[13205]: I0319 09:35:39.708878 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 09:35:39.709302 master-0 kubenswrapper[13205]: I0319 09:35:39.709270 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 09:35:39.766185 master-0 kubenswrapper[13205]: I0319 09:35:39.766128 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 09:35:39.782183 master-0 kubenswrapper[13205]: I0319 09:35:39.782111 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-r2bsk"
Mar 19 09:35:40.074883 master-0 kubenswrapper[13205]: I0319 09:35:40.074785 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 09:35:40.135443 master-0 kubenswrapper[13205]: I0319 09:35:40.135374 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tv2z8"
Mar 19 09:35:40.137870 master-0 kubenswrapper[13205]: I0319 09:35:40.137833 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 09:35:40.170980 master-0 kubenswrapper[13205]: I0319 09:35:40.170905 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 09:35:40.276309 master-0 kubenswrapper[13205]: I0319 09:35:40.276267 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 09:35:40.282424 master-0 kubenswrapper[13205]: I0319 09:35:40.282386 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 09:35:40.287441 master-0 kubenswrapper[13205]: I0319 09:35:40.287401 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 09:35:40.311293 master-0 kubenswrapper[13205]: I0319 09:35:40.311243 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:35:40.311584 master-0 kubenswrapper[13205]: I0319 09:35:40.311494 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" containerID="cri-o://d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c" gracePeriod=5
Mar 19 09:35:40.331801 master-0 kubenswrapper[13205]: I0319 09:35:40.331708 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2svn2"
Mar 19 09:35:40.338951 master-0 kubenswrapper[13205]: I0319 09:35:40.338925 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-99v25"
Mar 19 09:35:40.380722 master-0 kubenswrapper[13205]: I0319 09:35:40.380660 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 09:35:40.451869 master-0 kubenswrapper[13205]: I0319 09:35:40.451810 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 09:35:40.524057 master-0 kubenswrapper[13205]: I0319 09:35:40.523944 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:35:40.550974 master-0 kubenswrapper[13205]: I0319 09:35:40.550873 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 09:35:40.616470 master-0 kubenswrapper[13205]: I0319 09:35:40.616292 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:35:40.660424 master-0 kubenswrapper[13205]: I0319 09:35:40.660338 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 09:35:40.670983 master-0 kubenswrapper[13205]: I0319 09:35:40.670928 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 19 09:35:40.690058 master-0 kubenswrapper[13205]: I0319 09:35:40.690008 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 09:35:40.770717 master-0 kubenswrapper[13205]: I0319 09:35:40.770681 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 09:35:40.843961 master-0 kubenswrapper[13205]: I0319 09:35:40.843890 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 09:35:40.945192 master-0 kubenswrapper[13205]: I0319 09:35:40.945090 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 09:35:40.984288 master-0 kubenswrapper[13205]: I0319 09:35:40.984210 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 09:35:41.025639 master-0 kubenswrapper[13205]: I0319 09:35:41.025597 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:35:41.049952 master-0 kubenswrapper[13205]: I0319 09:35:41.049910 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 09:35:41.080227 master-0 kubenswrapper[13205]: I0319 09:35:41.080184 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 09:35:41.211202 master-0 kubenswrapper[13205]: I0319 09:35:41.211084 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 09:35:41.230199 master-0 kubenswrapper[13205]: I0319 09:35:41.230150 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:35:41.260316 master-0 kubenswrapper[13205]: I0319 09:35:41.260260 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 09:35:41.307873 master-0 kubenswrapper[13205]: I0319 09:35:41.307833 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 09:35:41.318186 master-0 kubenswrapper[13205]: I0319 09:35:41.318149 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:35:41.322488 master-0 kubenswrapper[13205]: I0319 09:35:41.322448 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 09:35:41.403644 master-0 kubenswrapper[13205]: I0319 09:35:41.403494 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 19 09:35:41.412180 master-0 kubenswrapper[13205]: I0319 09:35:41.412131 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 09:35:41.415629 master-0 kubenswrapper[13205]: I0319 09:35:41.415600 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 09:35:41.470333 master-0 kubenswrapper[13205]: I0319 09:35:41.470187 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 09:35:41.526445 master-0 kubenswrapper[13205]: I0319 09:35:41.526382 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-fmrxp"
Mar 19 09:35:41.537576 master-0 kubenswrapper[13205]: I0319 09:35:41.537501 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 19 09:35:41.658036 master-0 kubenswrapper[13205]: I0319 09:35:41.657959 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 09:35:41.716183 master-0 kubenswrapper[13205]: I0319 09:35:41.716128 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 09:35:41.808188 master-0 kubenswrapper[13205]: I0319 09:35:41.808139 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 09:35:41.823885 master-0 kubenswrapper[13205]: I0319 09:35:41.823830 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:35:41.841834 master-0 kubenswrapper[13205]: I0319 09:35:41.841759 13205 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 09:35:41.975154 master-0 kubenswrapper[13205]: I0319 09:35:41.975097 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 09:35:42.200932 master-0 kubenswrapper[13205]: I0319 09:35:42.200816 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 09:35:42.340259 master-0 kubenswrapper[13205]: I0319 09:35:42.340181 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 09:35:42.391569 master-0 kubenswrapper[13205]: I0319 09:35:42.391478 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-p5dd4"
Mar 19 09:35:42.393486 master-0 kubenswrapper[13205]: I0319 09:35:42.393412 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 09:35:42.481863 master-0 kubenswrapper[13205]: I0319 09:35:42.481733 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 09:35:42.493410 master-0 kubenswrapper[13205]: I0319 09:35:42.493358 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 09:35:42.629692 master-0 kubenswrapper[13205]: I0319 09:35:42.629639 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 19 09:35:42.645729 master-0 kubenswrapper[13205]: I0319 09:35:42.645677 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 09:35:42.689808 master-0 kubenswrapper[13205]: I0319 09:35:42.689745 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 09:35:42.794289 master-0 kubenswrapper[13205]: I0319 09:35:42.794233 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 09:35:42.812761 master-0 kubenswrapper[13205]: I0319 09:35:42.812707 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 09:35:42.814645 master-0 kubenswrapper[13205]: I0319 09:35:42.814608 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 09:35:42.830750 master-0 kubenswrapper[13205]: I0319 09:35:42.830663 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 09:35:42.981093 master-0 kubenswrapper[13205]: I0319 09:35:42.981056 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 09:35:42.993042 master-0 kubenswrapper[13205]: I0319 09:35:42.992996 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 09:35:43.017334 master-0 kubenswrapper[13205]: I0319 09:35:43.017285 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 09:35:43.167591 master-0 kubenswrapper[13205]: I0319 09:35:43.167415 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-d89qv"
Mar 19 09:35:45.915025 master-0 kubenswrapper[13205]: I0319 09:35:45.914970 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_ebbfbf2b56df0323ba118d68bfdad8b9/startup-monitor/0.log"
Mar 19 09:35:45.916241 master-0 kubenswrapper[13205]: I0319 09:35:45.916019 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:35:46.061912 master-0 kubenswrapper[13205]: I0319 09:35:46.061850 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:35:46.061912 master-0 kubenswrapper[13205]: I0319 09:35:46.061919 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:35:46.062168 master-0 kubenswrapper[13205]: I0319 09:35:46.062009 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:35:46.062168 master-0 kubenswrapper[13205]: I0319 09:35:46.062036 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:35:46.062168 master-0 kubenswrapper[13205]: I0319 09:35:46.062081 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:35:46.062168 master-0 kubenswrapper[13205]: I0319 09:35:46.062134 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock" (OuterVolumeSpecName: "var-lock") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:35:46.062168 master-0 kubenswrapper[13205]: I0319 09:35:46.062139 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:35:46.062452 master-0 kubenswrapper[13205]: I0319 09:35:46.062274 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log" (OuterVolumeSpecName: "var-log") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:35:46.062452 master-0 kubenswrapper[13205]: I0319 09:35:46.062263 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests" (OuterVolumeSpecName: "manifests") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:35:46.062705 master-0 kubenswrapper[13205]: I0319 09:35:46.062629 13205 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") on node \"master-0\" DevicePath \"\"" Mar 19 09:35:46.062705 master-0 kubenswrapper[13205]: I0319 09:35:46.062662 13205 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") on node \"master-0\" DevicePath \"\"" Mar 19 09:35:46.062705 master-0 kubenswrapper[13205]: I0319 09:35:46.062671 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:35:46.062705 master-0 kubenswrapper[13205]: I0319 09:35:46.062705 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:35:46.067601 master-0 kubenswrapper[13205]: I0319 09:35:46.067514 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:35:46.100700 master-0 kubenswrapper[13205]: I0319 09:35:46.100546 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_ebbfbf2b56df0323ba118d68bfdad8b9/startup-monitor/0.log" Mar 19 09:35:46.100700 master-0 kubenswrapper[13205]: I0319 09:35:46.100614 13205 generic.go:334] "Generic (PLEG): container finished" podID="ebbfbf2b56df0323ba118d68bfdad8b9" containerID="d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c" exitCode=137 Mar 19 09:35:46.100700 master-0 kubenswrapper[13205]: I0319 09:35:46.100661 13205 scope.go:117] "RemoveContainer" containerID="d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c" Mar 19 09:35:46.100700 master-0 kubenswrapper[13205]: I0319 09:35:46.100664 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:35:46.117487 master-0 kubenswrapper[13205]: I0319 09:35:46.117447 13205 scope.go:117] "RemoveContainer" containerID="d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c" Mar 19 09:35:46.117960 master-0 kubenswrapper[13205]: E0319 09:35:46.117912 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c\": container with ID starting with d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c not found: ID does not exist" containerID="d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c" Mar 19 09:35:46.118024 master-0 kubenswrapper[13205]: I0319 09:35:46.117961 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c"} err="failed to get container status 
\"d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c\": rpc error: code = NotFound desc = could not find container \"d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c\": container with ID starting with d9b5e7b16faa2c1a873c5ff4df811f271a8f4f406904c27851d6b3b7a3d9065c not found: ID does not exist" Mar 19 09:35:46.164325 master-0 kubenswrapper[13205]: I0319 09:35:46.164207 13205 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:35:46.863999 master-0 kubenswrapper[13205]: I0319 09:35:46.863907 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" path="/var/lib/kubelet/pods/ebbfbf2b56df0323ba118d68bfdad8b9/volumes" Mar 19 09:35:46.864465 master-0 kubenswrapper[13205]: I0319 09:35:46.864424 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 19 09:35:46.882816 master-0 kubenswrapper[13205]: I0319 09:35:46.882727 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:35:46.882816 master-0 kubenswrapper[13205]: I0319 09:35:46.882795 13205 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="37e89777-4be7-40dd-a552-e49a8de5320e" Mar 19 09:35:46.890145 master-0 kubenswrapper[13205]: I0319 09:35:46.890065 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:35:46.890145 master-0 kubenswrapper[13205]: I0319 09:35:46.890135 13205 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
mirrorPodUID="37e89777-4be7-40dd-a552-e49a8de5320e" Mar 19 09:35:47.144627 master-0 kubenswrapper[13205]: I0319 09:35:47.144418 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:35:47.144627 master-0 kubenswrapper[13205]: I0319 09:35:47.144492 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:35:57.144226 master-0 kubenswrapper[13205]: I0319 09:35:57.144105 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:35:57.144226 master-0 kubenswrapper[13205]: I0319 09:35:57.144223 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:35:58.349302 master-0 kubenswrapper[13205]: I0319 09:35:58.349224 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:36:07.144538 master-0 kubenswrapper[13205]: I0319 09:36:07.144426 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" 
start-of-body= Mar 19 09:36:07.145200 master-0 kubenswrapper[13205]: I0319 09:36:07.145166 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:36:08.889280 master-0 kubenswrapper[13205]: I0319 09:36:08.889252 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 09:36:17.144047 master-0 kubenswrapper[13205]: I0319 09:36:17.143971 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:36:17.144713 master-0 kubenswrapper[13205]: I0319 09:36:17.144063 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:36:17.428595 master-0 kubenswrapper[13205]: I0319 09:36:17.425616 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 09:36:24.320595 master-0 kubenswrapper[13205]: I0319 09:36:24.320502 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-92z97" Mar 19 09:36:24.787702 master-0 kubenswrapper[13205]: I0319 09:36:24.787638 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:36:27.144261 master-0 kubenswrapper[13205]: I0319 
09:36:27.144163 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:36:27.144765 master-0 kubenswrapper[13205]: I0319 09:36:27.144267 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:36:35.225273 master-0 kubenswrapper[13205]: E0319 09:36:35.225221 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:36:37.143940 master-0 kubenswrapper[13205]: I0319 09:36:37.143864 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:36:37.143940 master-0 kubenswrapper[13205]: I0319 09:36:37.143927 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:36:47.144336 master-0 kubenswrapper[13205]: I0319 09:36:47.144281 13205 
patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:36:47.145176 master-0 kubenswrapper[13205]: I0319 09:36:47.145142 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:36:57.144430 master-0 kubenswrapper[13205]: I0319 09:36:57.144372 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:36:57.145124 master-0 kubenswrapper[13205]: I0319 09:36:57.144435 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:37:07.144591 master-0 kubenswrapper[13205]: I0319 09:37:07.144511 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 19 09:37:07.145219 master-0 kubenswrapper[13205]: I0319 09:37:07.144607 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get 
\"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 19 09:37:08.835960 master-0 kubenswrapper[13205]: I0319 09:37:08.835885 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-7-master-0"] Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: E0319 09:37:08.836219 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: I0319 09:37:08.836238 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: E0319 09:37:08.836253 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" containerName="installer" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: I0319 09:37:08.836261 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" containerName="installer" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: I0319 09:37:08.836414 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2139b7-8af8-4294-aee2-3e7429d2b1fe" containerName="installer" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: I0319 09:37:08.836431 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: I0319 09:37:08.836833 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: I0319 09:37:08.840625 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 09:37:08.842471 master-0 kubenswrapper[13205]: I0319 09:37:08.840825 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-xzz4b" Mar 19 09:37:08.856931 master-0 kubenswrapper[13205]: I0319 09:37:08.856887 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-7-master-0"] Mar 19 09:37:08.963753 master-0 kubenswrapper[13205]: I0319 09:37:08.963710 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26a49eb-85ac-47e1-b098-557f6e625958-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:08.964042 master-0 kubenswrapper[13205]: I0319 09:37:08.964019 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26a49eb-85ac-47e1-b098-557f6e625958-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:09.065990 master-0 kubenswrapper[13205]: I0319 09:37:09.065767 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26a49eb-85ac-47e1-b098-557f6e625958-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:09.065990 master-0 
kubenswrapper[13205]: I0319 09:37:09.065885 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26a49eb-85ac-47e1-b098-557f6e625958-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:09.066258 master-0 kubenswrapper[13205]: I0319 09:37:09.066020 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26a49eb-85ac-47e1-b098-557f6e625958-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:09.089663 master-0 kubenswrapper[13205]: I0319 09:37:09.089559 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26a49eb-85ac-47e1-b098-557f6e625958-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:09.221547 master-0 kubenswrapper[13205]: I0319 09:37:09.221474 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:09.677648 master-0 kubenswrapper[13205]: I0319 09:37:09.677458 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-7-master-0"] Mar 19 09:37:10.686746 master-0 kubenswrapper[13205]: I0319 09:37:10.686683 13205 generic.go:334] "Generic (PLEG): container finished" podID="d26a49eb-85ac-47e1-b098-557f6e625958" containerID="426788834fbbe7b88673fcca7f7c21664e06a342e9241e58a69e3a89a731cf18" exitCode=0 Mar 19 09:37:10.686746 master-0 kubenswrapper[13205]: I0319 09:37:10.686736 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-7-master-0" event={"ID":"d26a49eb-85ac-47e1-b098-557f6e625958","Type":"ContainerDied","Data":"426788834fbbe7b88673fcca7f7c21664e06a342e9241e58a69e3a89a731cf18"} Mar 19 09:37:10.688115 master-0 kubenswrapper[13205]: I0319 09:37:10.686763 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-7-master-0" event={"ID":"d26a49eb-85ac-47e1-b098-557f6e625958","Type":"ContainerStarted","Data":"3c5bcde2cba86d99ae6141a6b21ed516111c2d608d9387d93c3c7bd31f9c8dba"} Mar 19 09:37:10.918953 master-0 kubenswrapper[13205]: I0319 09:37:10.918825 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-7-master-0"] Mar 19 09:37:10.920347 master-0 kubenswrapper[13205]: I0319 09:37:10.920312 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:10.939918 master-0 kubenswrapper[13205]: I0319 09:37:10.939852 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-master-0"] Mar 19 09:37:10.989784 master-0 kubenswrapper[13205]: I0319 09:37:10.989734 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-var-lock\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:10.989987 master-0 kubenswrapper[13205]: I0319 09:37:10.989808 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e546c2e-a198-4141-a1de-518b8d71d107-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:10.989987 master-0 kubenswrapper[13205]: I0319 09:37:10.989878 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.091443 master-0 kubenswrapper[13205]: I0319 09:37:11.091362 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-var-lock\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.091746 master-0 kubenswrapper[13205]: I0319 09:37:11.091483 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-var-lock\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.091746 master-0 kubenswrapper[13205]: I0319 09:37:11.091492 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e546c2e-a198-4141-a1de-518b8d71d107-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.091746 master-0 kubenswrapper[13205]: I0319 09:37:11.091580 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.091746 master-0 kubenswrapper[13205]: I0319 09:37:11.091673 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.108450 master-0 kubenswrapper[13205]: I0319 09:37:11.108392 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e546c2e-a198-4141-a1de-518b8d71d107-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.245638 master-0 kubenswrapper[13205]: I0319 09:37:11.245487 13205 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:11.701823 master-0 kubenswrapper[13205]: I0319 09:37:11.701766 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-master-0"] Mar 19 09:37:12.033667 master-0 kubenswrapper[13205]: I0319 09:37:12.033624 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0" Mar 19 09:37:12.209632 master-0 kubenswrapper[13205]: I0319 09:37:12.209507 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26a49eb-85ac-47e1-b098-557f6e625958-kubelet-dir\") pod \"d26a49eb-85ac-47e1-b098-557f6e625958\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") " Mar 19 09:37:12.209829 master-0 kubenswrapper[13205]: I0319 09:37:12.209634 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d26a49eb-85ac-47e1-b098-557f6e625958-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d26a49eb-85ac-47e1-b098-557f6e625958" (UID: "d26a49eb-85ac-47e1-b098-557f6e625958"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:37:12.209829 master-0 kubenswrapper[13205]: I0319 09:37:12.209735 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26a49eb-85ac-47e1-b098-557f6e625958-kube-api-access\") pod \"d26a49eb-85ac-47e1-b098-557f6e625958\" (UID: \"d26a49eb-85ac-47e1-b098-557f6e625958\") "
Mar 19 09:37:12.210062 master-0 kubenswrapper[13205]: I0319 09:37:12.210033 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d26a49eb-85ac-47e1-b098-557f6e625958-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:37:12.217845 master-0 kubenswrapper[13205]: I0319 09:37:12.217785 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26a49eb-85ac-47e1-b098-557f6e625958-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d26a49eb-85ac-47e1-b098-557f6e625958" (UID: "d26a49eb-85ac-47e1-b098-557f6e625958"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:37:12.311875 master-0 kubenswrapper[13205]: I0319 09:37:12.311783 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d26a49eb-85ac-47e1-b098-557f6e625958-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:37:12.706444 master-0 kubenswrapper[13205]: I0319 09:37:12.706338 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-7-master-0" event={"ID":"d26a49eb-85ac-47e1-b098-557f6e625958","Type":"ContainerDied","Data":"3c5bcde2cba86d99ae6141a6b21ed516111c2d608d9387d93c3c7bd31f9c8dba"}
Mar 19 09:37:12.706444 master-0 kubenswrapper[13205]: I0319 09:37:12.706407 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 19 09:37:12.707249 master-0 kubenswrapper[13205]: I0319 09:37:12.706411 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c5bcde2cba86d99ae6141a6b21ed516111c2d608d9387d93c3c7bd31f9c8dba"
Mar 19 09:37:12.707844 master-0 kubenswrapper[13205]: I0319 09:37:12.707818 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7e546c2e-a198-4141-a1de-518b8d71d107","Type":"ContainerStarted","Data":"593c34e2f8c4f47957630bff7700d96b8ef85638b97f56e8e46adb46fe811184"}
Mar 19 09:37:12.707966 master-0 kubenswrapper[13205]: I0319 09:37:12.707946 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7e546c2e-a198-4141-a1de-518b8d71d107","Type":"ContainerStarted","Data":"db43dc0a8eb67f01b58fc008cc443cc4e30f2215501d914fa629dcfbf94c8f3b"}
Mar 19 09:37:12.733773 master-0 kubenswrapper[13205]: I0319 09:37:12.733698 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-7-master-0" podStartSLOduration=2.733682126 podStartE2EDuration="2.733682126s" podCreationTimestamp="2026-03-19 09:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:12.731783928 +0000 UTC m=+818.064090856" watchObservedRunningTime="2026-03-19 09:37:12.733682126 +0000 UTC m=+818.065989014"
Mar 19 09:37:17.144500 master-0 kubenswrapper[13205]: I0319 09:37:17.144436 13205 patch_prober.go:28] interesting pod/console-79f67cdc89-bx72w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 19 09:37:17.144500 master-0 kubenswrapper[13205]: I0319 09:37:17.144492 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 19 09:37:17.438605 master-0 kubenswrapper[13205]: I0319 09:37:17.438464 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"]
Mar 19 09:37:17.438810 master-0 kubenswrapper[13205]: E0319 09:37:17.438726 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26a49eb-85ac-47e1-b098-557f6e625958" containerName="pruner"
Mar 19 09:37:17.438810 master-0 kubenswrapper[13205]: I0319 09:37:17.438740 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26a49eb-85ac-47e1-b098-557f6e625958" containerName="pruner"
Mar 19 09:37:17.438929 master-0 kubenswrapper[13205]: I0319 09:37:17.438900 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26a49eb-85ac-47e1-b098-557f6e625958" containerName="pruner"
Mar 19 09:37:17.439269 master-0 kubenswrapper[13205]: I0319 09:37:17.439249 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.442982 master-0 kubenswrapper[13205]: I0319 09:37:17.442908 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-qzwhq"
Mar 19 09:37:17.443168 master-0 kubenswrapper[13205]: I0319 09:37:17.443091 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 19 09:37:17.465108 master-0 kubenswrapper[13205]: I0319 09:37:17.465029 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"]
Mar 19 09:37:17.482989 master-0 kubenswrapper[13205]: I0319 09:37:17.482920 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.483221 master-0 kubenswrapper[13205]: I0319 09:37:17.483033 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-var-lock\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.483221 master-0 kubenswrapper[13205]: I0319 09:37:17.483098 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52d83e52-4097-4c66-ad8b-bd524ff59c95-kube-api-access\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.584500 master-0 kubenswrapper[13205]: I0319 09:37:17.584438 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52d83e52-4097-4c66-ad8b-bd524ff59c95-kube-api-access\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.584753 master-0 kubenswrapper[13205]: I0319 09:37:17.584568 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.584753 master-0 kubenswrapper[13205]: I0319 09:37:17.584634 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-var-lock\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.584857 master-0 kubenswrapper[13205]: I0319 09:37:17.584752 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.584857 master-0 kubenswrapper[13205]: I0319 09:37:17.584818 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-var-lock\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.611074 master-0 kubenswrapper[13205]: I0319 09:37:17.611002 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52d83e52-4097-4c66-ad8b-bd524ff59c95-kube-api-access\") pod \"installer-7-master-0\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:17.764869 master-0 kubenswrapper[13205]: I0319 09:37:17.764779 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0"
Mar 19 09:37:18.215615 master-0 kubenswrapper[13205]: I0319 09:37:18.214397 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"]
Mar 19 09:37:18.754810 master-0 kubenswrapper[13205]: I0319 09:37:18.754626 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"52d83e52-4097-4c66-ad8b-bd524ff59c95","Type":"ContainerStarted","Data":"d87a4c5e1a8b8762a45b7129c8b874d4eeb705e1939872316e75c94012fbdbc4"}
Mar 19 09:37:18.758874 master-0 kubenswrapper[13205]: I0319 09:37:18.758761 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"52d83e52-4097-4c66-ad8b-bd524ff59c95","Type":"ContainerStarted","Data":"c3a2278d98bb9dc4cc015efa7b01530c696c897749750a39371c05bff7dcb12f"}
Mar 19 09:37:18.781837 master-0 kubenswrapper[13205]: I0319 09:37:18.780172 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-7-master-0" podStartSLOduration=1.780154335 podStartE2EDuration="1.780154335s" podCreationTimestamp="2026-03-19 09:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:18.778149626 +0000 UTC m=+824.110456544" watchObservedRunningTime="2026-03-19 09:37:18.780154335 +0000 UTC m=+824.112461223"
Mar 19 09:37:22.343693 master-0 kubenswrapper[13205]: I0319 09:37:22.343634 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"]
Mar 19 09:37:22.344795 master-0 kubenswrapper[13205]: I0319 09:37:22.344767 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.351669 master-0 kubenswrapper[13205]: I0319 09:37:22.351593 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-224sj"
Mar 19 09:37:22.352127 master-0 kubenswrapper[13205]: I0319 09:37:22.352099 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:37:22.352278 master-0 kubenswrapper[13205]: I0319 09:37:22.352251 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:37:22.352662 master-0 kubenswrapper[13205]: I0319 09:37:22.352588 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:37:22.352801 master-0 kubenswrapper[13205]: I0319 09:37:22.352772 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:37:22.354147 master-0 kubenswrapper[13205]: I0319 09:37:22.354118 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"]
Mar 19 09:37:22.354642 master-0 kubenswrapper[13205]: I0319 09:37:22.354584 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:37:22.355055 master-0 kubenswrapper[13205]: I0319 09:37:22.355024 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.357032 master-0 kubenswrapper[13205]: I0319 09:37:22.356878 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:37:22.359326 master-0 kubenswrapper[13205]: I0319 09:37:22.358805 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:37:22.359326 master-0 kubenswrapper[13205]: I0319 09:37:22.359223 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:37:22.360982 master-0 kubenswrapper[13205]: I0319 09:37:22.360948 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:37:22.362554 master-0 kubenswrapper[13205]: I0319 09:37:22.362489 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:37:22.363013 master-0 kubenswrapper[13205]: I0319 09:37:22.362920 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:37:22.364620 master-0 kubenswrapper[13205]: I0319 09:37:22.364566 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-55975d94bc-hbbc2"]
Mar 19 09:37:22.365702 master-0 kubenswrapper[13205]: I0319 09:37:22.365671 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2"
Mar 19 09:37:22.368630 master-0 kubenswrapper[13205]: I0319 09:37:22.368591 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 09:37:22.369696 master-0 kubenswrapper[13205]: I0319 09:37:22.369663 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 09:37:22.369906 master-0 kubenswrapper[13205]: I0319 09:37:22.369877 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 19 09:37:22.371442 master-0 kubenswrapper[13205]: I0319 09:37:22.371395 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 09:37:22.371442 master-0 kubenswrapper[13205]: I0319 09:37:22.371417 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 09:37:22.371587 master-0 kubenswrapper[13205]: I0319 09:37:22.371417 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 09:37:22.371705 master-0 kubenswrapper[13205]: I0319 09:37:22.371685 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 09:37:22.373040 master-0 kubenswrapper[13205]: I0319 09:37:22.373017 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:37:22.373632 master-0 kubenswrapper[13205]: I0319 09:37:22.373599 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 09:37:22.375119 master-0 kubenswrapper[13205]: I0319 09:37:22.375093 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 09:37:22.375281 master-0 kubenswrapper[13205]: I0319 09:37:22.375245 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 09:37:22.375509 master-0 kubenswrapper[13205]: I0319 09:37:22.375488 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-hnn89"
Mar 19 09:37:22.375703 master-0 kubenswrapper[13205]: I0319 09:37:22.375683 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 09:37:22.375837 master-0 kubenswrapper[13205]: I0319 09:37:22.375808 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 19 09:37:22.375971 master-0 kubenswrapper[13205]: I0319 09:37:22.375938 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 19 09:37:22.376508 master-0 kubenswrapper[13205]: I0319 09:37:22.376482 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 09:37:22.377639 master-0 kubenswrapper[13205]: I0319 09:37:22.377616 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 19 09:37:22.378127 master-0 kubenswrapper[13205]: I0319 09:37:22.378098 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.378568 master-0 kubenswrapper[13205]: I0319 09:37:22.378517 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 19 09:37:22.379961 master-0 kubenswrapper[13205]: I0319 09:37:22.379898 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"]
Mar 19 09:37:22.380038 master-0 kubenswrapper[13205]: I0319 09:37:22.380023 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-hwfkt"
Mar 19 09:37:22.385694 master-0 kubenswrapper[13205]: I0319 09:37:22.384949 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 19 09:37:22.389544 master-0 kubenswrapper[13205]: I0319 09:37:22.386123 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 19 09:37:22.389544 master-0 kubenswrapper[13205]: I0319 09:37:22.387489 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 19 09:37:22.389544 master-0 kubenswrapper[13205]: I0319 09:37:22.387592 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-be6kne0s3lnpg"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.392589 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.393496 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.393872 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.394015 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.395602 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.395936 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.396667 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.397315 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-252nv"
Mar 19 09:37:22.397645 master-0 kubenswrapper[13205]: I0319 09:37:22.397465 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 19 09:37:22.404696 master-0 kubenswrapper[13205]: I0319 09:37:22.401707 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 19 09:37:22.408238 master-0 kubenswrapper[13205]: I0319 09:37:22.407963 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 19 09:37:22.408483 master-0 kubenswrapper[13205]: I0319 09:37:22.408439 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 09:37:22.410109 master-0 kubenswrapper[13205]: I0319 09:37:22.409988 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 19 09:37:22.412854 master-0 kubenswrapper[13205]: I0319 09:37:22.412568 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 19 09:37:22.413618 master-0 kubenswrapper[13205]: I0319 09:37:22.413581 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55975d94bc-hbbc2"]
Mar 19 09:37:22.415879 master-0 kubenswrapper[13205]: I0319 09:37:22.415832 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 19 09:37:22.418317 master-0 kubenswrapper[13205]: I0319 09:37:22.417838 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 09:37:22.421466 master-0 kubenswrapper[13205]: I0319 09:37:22.421423 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 19 09:37:22.428280 master-0 kubenswrapper[13205]: I0319 09:37:22.428236 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"]
Mar 19 09:37:22.433677 master-0 kubenswrapper[13205]: I0319 09:37:22.433621 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 19 09:37:22.455857 master-0 kubenswrapper[13205]: I0319 09:37:22.455812 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mp9v\" (UniqueName: \"kubernetes.io/projected/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-kube-api-access-4mp9v\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.455857 master-0 kubenswrapper[13205]: I0319 09:37:22.455870 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-client-ca\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.456079 master-0 kubenswrapper[13205]: I0319 09:37:22.455895 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvsbq\" (UniqueName: \"kubernetes.io/projected/289592ca-1a91-48ca-8913-94a8ed76685e-kube-api-access-kvsbq\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.456079 master-0 kubenswrapper[13205]: I0319 09:37:22.455915 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-config\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.456079 master-0 kubenswrapper[13205]: I0319 09:37:22.455936 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-proxy-ca-bundles\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.456079 master-0 kubenswrapper[13205]: I0319 09:37:22.455950 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-serving-cert\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.456079 master-0 kubenswrapper[13205]: I0319 09:37:22.455976 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-client-ca\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.456079 master-0 kubenswrapper[13205]: I0319 09:37:22.455991 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289592ca-1a91-48ca-8913-94a8ed76685e-serving-cert\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.456079 master-0 kubenswrapper[13205]: I0319 09:37:22.456010 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-config\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.557336 master-0 kubenswrapper[13205]: I0319 09:37:22.557275 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-web-config\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:37:22.557336 master-0 kubenswrapper[13205]: I0319 09:37:22.557334 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-client-ca\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.557336 master-0 kubenswrapper[13205]: I0319 09:37:22.557355 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557394 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557417 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557445 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557460 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-config-volume\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557475 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b6909618-647a-45d3-9027-3c3578992af1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557496 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557522 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvsbq\" (UniqueName: \"kubernetes.io/projected/289592ca-1a91-48ca-8913-94a8ed76685e-kube-api-access-kvsbq\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557573 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557594 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-service-ca\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557615 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-config\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557636 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557656 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-proxy-ca-bundles\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557673 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-login\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557690 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-serving-cert\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557707 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557724 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6909618-647a-45d3-9027-3c3578992af1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557743 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqn8\" (UniqueName: \"kubernetes.io/projected/05446c48-303e-434f-9d9b-eec4f1f2b253-kube-api-access-plqn8\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557763 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557779 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557797 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-client-ca\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557812 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-web-config\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557832 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289592ca-1a91-48ca-8913-94a8ed76685e-serving-cert\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557848 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6909618-647a-45d3-9027-3c3578992af1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557866 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2"
Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557883 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-thanos-sidecar-tls\") pod
\"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557898 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-config\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557914 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-config\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557930 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557945 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557963 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-error\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557980 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c07b2b70-f2f5-4575-add3-5fde88fc4848-config-out\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.557995 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.557953 master-0 kubenswrapper[13205]: I0319 09:37:22.558011 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-audit-policies\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558031 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6909618-647a-45d3-9027-3c3578992af1-config-out\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.562975 master-0 
kubenswrapper[13205]: I0319 09:37:22.558047 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05446c48-303e-434f-9d9b-eec4f1f2b253-audit-dir\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558062 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvtt\" (UniqueName: \"kubernetes.io/projected/b6909618-647a-45d3-9027-3c3578992af1-kube-api-access-xgvtt\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558078 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558100 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558117 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558135 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558154 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnx2z\" (UniqueName: \"kubernetes.io/projected/c07b2b70-f2f5-4575-add3-5fde88fc4848-kube-api-access-jnx2z\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558171 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558192 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6909618-647a-45d3-9027-3c3578992af1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558212 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c07b2b70-f2f5-4575-add3-5fde88fc4848-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558230 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mp9v\" (UniqueName: \"kubernetes.io/projected/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-kube-api-access-4mp9v\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558246 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-session\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558269 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-router-certs\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558293 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558316 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.558565 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-client-ca\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.559551 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-config\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.560232 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-client-ca\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " 
pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:22.562975 master-0 kubenswrapper[13205]: I0319 09:37:22.560757 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-proxy-ca-bundles\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:22.563997 master-0 kubenswrapper[13205]: I0319 09:37:22.563923 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/289592ca-1a91-48ca-8913-94a8ed76685e-config\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:22.572590 master-0 kubenswrapper[13205]: I0319 09:37:22.564788 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-serving-cert\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 09:37:22.572590 master-0 kubenswrapper[13205]: I0319 09:37:22.565278 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/289592ca-1a91-48ca-8913-94a8ed76685e-serving-cert\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:22.581820 master-0 kubenswrapper[13205]: I0319 09:37:22.581755 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mp9v\" (UniqueName: 
\"kubernetes.io/projected/b72ad58d-ee2c-4c99-8c1d-33ca6c48b582-kube-api-access-4mp9v\") pod \"route-controller-manager-6678cf68fb-dd79n\" (UID: \"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582\") " pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 09:37:22.585753 master-0 kubenswrapper[13205]: I0319 09:37:22.585677 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvsbq\" (UniqueName: \"kubernetes.io/projected/289592ca-1a91-48ca-8913-94a8ed76685e-kube-api-access-kvsbq\") pod \"controller-manager-7ccb4c8f4d-56pkc\" (UID: \"289592ca-1a91-48ca-8913-94a8ed76685e\") " pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:22.659946 master-0 kubenswrapper[13205]: I0319 09:37:22.659807 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.659946 master-0 kubenswrapper[13205]: I0319 09:37:22.659868 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.659946 master-0 kubenswrapper[13205]: I0319 09:37:22.659902 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.659946 
master-0 kubenswrapper[13205]: I0319 09:37:22.659936 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-config-volume\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.660278 master-0 kubenswrapper[13205]: I0319 09:37:22.659959 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.660278 master-0 kubenswrapper[13205]: I0319 09:37:22.660190 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b6909618-647a-45d3-9027-3c3578992af1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.660278 master-0 kubenswrapper[13205]: I0319 09:37:22.660213 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.660936 master-0 kubenswrapper[13205]: I0319 09:37:22.660883 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661021 master-0 kubenswrapper[13205]: I0319 09:37:22.660966 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-service-ca\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661075 master-0 kubenswrapper[13205]: I0319 09:37:22.661016 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661075 master-0 kubenswrapper[13205]: I0319 09:37:22.661068 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-login\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661163 master-0 kubenswrapper[13205]: I0319 09:37:22.661094 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661163 master-0 kubenswrapper[13205]: I0319 09:37:22.661135 
13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6909618-647a-45d3-9027-3c3578992af1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661246 master-0 kubenswrapper[13205]: I0319 09:37:22.661169 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plqn8\" (UniqueName: \"kubernetes.io/projected/05446c48-303e-434f-9d9b-eec4f1f2b253-kube-api-access-plqn8\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661246 master-0 kubenswrapper[13205]: I0319 09:37:22.661205 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661246 master-0 kubenswrapper[13205]: I0319 09:37:22.661229 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661366 master-0 kubenswrapper[13205]: I0319 09:37:22.661265 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-web-config\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661366 master-0 kubenswrapper[13205]: I0319 09:37:22.661289 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6909618-647a-45d3-9027-3c3578992af1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661366 master-0 kubenswrapper[13205]: I0319 09:37:22.661327 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661366 master-0 kubenswrapper[13205]: I0319 09:37:22.661350 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661547 master-0 kubenswrapper[13205]: I0319 09:37:22.661377 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-config\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661547 master-0 kubenswrapper[13205]: I0319 09:37:22.661406 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661547 master-0 kubenswrapper[13205]: I0319 09:37:22.661431 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661547 master-0 kubenswrapper[13205]: I0319 09:37:22.661473 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-audit-policies\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661547 master-0 kubenswrapper[13205]: I0319 09:37:22.661497 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-error\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661547 master-0 kubenswrapper[13205]: I0319 09:37:22.661519 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c07b2b70-f2f5-4575-add3-5fde88fc4848-config-out\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661565 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661590 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6909618-647a-45d3-9027-3c3578992af1-config-out\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661617 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05446c48-303e-434f-9d9b-eec4f1f2b253-audit-dir\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661639 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvtt\" (UniqueName: \"kubernetes.io/projected/b6909618-647a-45d3-9027-3c3578992af1-kube-api-access-xgvtt\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661665 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661783 
master-0 kubenswrapper[13205]: I0319 09:37:22.661696 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661726 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661751 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6909618-647a-45d3-9027-3c3578992af1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.661783 master-0 kubenswrapper[13205]: I0319 09:37:22.661771 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661794 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661814 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnx2z\" (UniqueName: \"kubernetes.io/projected/c07b2b70-f2f5-4575-add3-5fde88fc4848-kube-api-access-jnx2z\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661855 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-session\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661869 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661876 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c07b2b70-f2f5-4575-add3-5fde88fc4848-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661940 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-router-certs\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661974 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.661997 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662152 master-0 kubenswrapper[13205]: I0319 09:37:22.662032 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-web-config\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.662703 master-0 kubenswrapper[13205]: I0319 09:37:22.662672 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b6909618-647a-45d3-9027-3c3578992af1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.663021 master-0 
kubenswrapper[13205]: I0319 09:37:22.662686 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05446c48-303e-434f-9d9b-eec4f1f2b253-audit-dir\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.663398 master-0 kubenswrapper[13205]: I0319 09:37:22.663363 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.663472 master-0 kubenswrapper[13205]: I0319 09:37:22.661616 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.664018 master-0 kubenswrapper[13205]: I0319 09:37:22.663984 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-service-ca\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.665393 master-0 kubenswrapper[13205]: I0319 09:37:22.665342 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.665496 master-0 kubenswrapper[13205]: I0319 09:37:22.665361 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.665966 master-0 kubenswrapper[13205]: I0319 09:37:22.665928 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-web-config\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.666597 master-0 kubenswrapper[13205]: I0319 09:37:22.666561 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c07b2b70-f2f5-4575-add3-5fde88fc4848-config-out\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.666760 master-0 kubenswrapper[13205]: I0319 09:37:22.666732 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.667383 master-0 kubenswrapper[13205]: I0319 09:37:22.667354 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.667478 master-0 kubenswrapper[13205]: I0319 09:37:22.667455 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.668008 master-0 kubenswrapper[13205]: I0319 09:37:22.667977 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05446c48-303e-434f-9d9b-eec4f1f2b253-audit-policies\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.668488 master-0 kubenswrapper[13205]: I0319 09:37:22.668440 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.669223 master-0 kubenswrapper[13205]: I0319 09:37:22.669126 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.669439 master-0 kubenswrapper[13205]: I0319 09:37:22.669404 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.669644 master-0 kubenswrapper[13205]: I0319 09:37:22.669589 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.669833 master-0 kubenswrapper[13205]: I0319 09:37:22.669801 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.669923 master-0 kubenswrapper[13205]: I0319 09:37:22.669865 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.670745 master-0 kubenswrapper[13205]: I0319 09:37:22.670690 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.671113 master-0 kubenswrapper[13205]: I0319 09:37:22.671062 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6909618-647a-45d3-9027-3c3578992af1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.671253 master-0 kubenswrapper[13205]: I0319 09:37:22.671167 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.671495 master-0 kubenswrapper[13205]: I0319 09:37:22.671446 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.672375 master-0 kubenswrapper[13205]: I0319 09:37:22.672315 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6909618-647a-45d3-9027-3c3578992af1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.672790 master-0 kubenswrapper[13205]: I0319 09:37:22.672741 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6909618-647a-45d3-9027-3c3578992af1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.677891 master-0 kubenswrapper[13205]: I0319 09:37:22.674631 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.677891 master-0 kubenswrapper[13205]: I0319 09:37:22.674846 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-error\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.677891 master-0 kubenswrapper[13205]: I0319 09:37:22.675115 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.677891 master-0 kubenswrapper[13205]: I0319 09:37:22.675460 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:22.678728 master-0 kubenswrapper[13205]: I0319 09:37:22.678516 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-session\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.678728 master-0 kubenswrapper[13205]: I0319 09:37:22.678611 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6909618-647a-45d3-9027-3c3578992af1-web-config\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.679938 master-0 kubenswrapper[13205]: I0319 09:37:22.678890 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-config\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.679938 master-0 kubenswrapper[13205]: I0319 09:37:22.679247 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6909618-647a-45d3-9027-3c3578992af1-config-out\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.679938 master-0 kubenswrapper[13205]: I0319 09:37:22.679514 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-user-template-login\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: 
\"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.680102 master-0 kubenswrapper[13205]: I0319 09:37:22.680040 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c07b2b70-f2f5-4575-add3-5fde88fc4848-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.684775 master-0 kubenswrapper[13205]: I0319 09:37:22.683245 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.684775 master-0 kubenswrapper[13205]: I0319 09:37:22.684168 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05446c48-303e-434f-9d9b-eec4f1f2b253-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.691832 master-0 kubenswrapper[13205]: I0319 09:37:22.691789 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgvtt\" (UniqueName: \"kubernetes.io/projected/b6909618-647a-45d3-9027-3c3578992af1-kube-api-access-xgvtt\") pod \"alertmanager-main-0\" (UID: \"b6909618-647a-45d3-9027-3c3578992af1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:22.692967 master-0 kubenswrapper[13205]: I0319 09:37:22.692919 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.693595 master-0 kubenswrapper[13205]: I0319 09:37:22.693557 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqn8\" (UniqueName: \"kubernetes.io/projected/05446c48-303e-434f-9d9b-eec4f1f2b253-kube-api-access-plqn8\") pod \"oauth-openshift-55975d94bc-hbbc2\" (UID: \"05446c48-303e-434f-9d9b-eec4f1f2b253\") " pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.694156 master-0 kubenswrapper[13205]: I0319 09:37:22.694117 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c07b2b70-f2f5-4575-add3-5fde88fc4848-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.699678 master-0 kubenswrapper[13205]: I0319 09:37:22.699628 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 09:37:22.700922 master-0 kubenswrapper[13205]: I0319 09:37:22.700893 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnx2z\" (UniqueName: \"kubernetes.io/projected/c07b2b70-f2f5-4575-add3-5fde88fc4848-kube-api-access-jnx2z\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.701510 master-0 kubenswrapper[13205]: I0319 09:37:22.701473 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c07b2b70-f2f5-4575-add3-5fde88fc4848-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c07b2b70-f2f5-4575-add3-5fde88fc4848\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.721069 master-0 kubenswrapper[13205]: I0319 09:37:22.721014 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:22.737251 master-0 kubenswrapper[13205]: I0319 09:37:22.737189 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:22.754928 master-0 kubenswrapper[13205]: I0319 09:37:22.754878 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:37:23.112100 master-0 kubenswrapper[13205]: I0319 09:37:23.112069 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc"] Mar 19 09:37:23.115301 master-0 kubenswrapper[13205]: W0319 09:37:23.115251 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289592ca_1a91_48ca_8913_94a8ed76685e.slice/crio-93a3c6edc224e0b7d443130215cb46415efd83678c8d9524c4d6a1e88c84ebe6 WatchSource:0}: Error finding container 93a3c6edc224e0b7d443130215cb46415efd83678c8d9524c4d6a1e88c84ebe6: Status 404 returned error can't find the container with id 93a3c6edc224e0b7d443130215cb46415efd83678c8d9524c4d6a1e88c84ebe6 Mar 19 09:37:23.214814 master-0 kubenswrapper[13205]: I0319 09:37:23.214758 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n"] Mar 19 09:37:23.218453 master-0 kubenswrapper[13205]: W0319 09:37:23.218406 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb72ad58d_ee2c_4c99_8c1d_33ca6c48b582.slice/crio-9a2bf17763c8e3b1fac2d022885fc5b7288571864a8f0a67453e4c24c80a647f WatchSource:0}: Error finding container 9a2bf17763c8e3b1fac2d022885fc5b7288571864a8f0a67453e4c24c80a647f: Status 404 returned error can't find the container with id 9a2bf17763c8e3b1fac2d022885fc5b7288571864a8f0a67453e4c24c80a647f Mar 19 09:37:23.293076 master-0 kubenswrapper[13205]: W0319 09:37:23.292765 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05446c48_303e_434f_9d9b_eec4f1f2b253.slice/crio-82dad27473a7bb2c152335369c409a1584a1b2b01be8c381ee232655db396120 WatchSource:0}: Error finding container 
82dad27473a7bb2c152335369c409a1584a1b2b01be8c381ee232655db396120: Status 404 returned error can't find the container with id 82dad27473a7bb2c152335369c409a1584a1b2b01be8c381ee232655db396120 Mar 19 09:37:23.296636 master-0 kubenswrapper[13205]: I0319 09:37:23.295933 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55975d94bc-hbbc2"] Mar 19 09:37:23.301426 master-0 kubenswrapper[13205]: I0319 09:37:23.301383 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:37:23.317348 master-0 kubenswrapper[13205]: I0319 09:37:23.315665 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:37:23.793272 master-0 kubenswrapper[13205]: I0319 09:37:23.793201 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" event={"ID":"05446c48-303e-434f-9d9b-eec4f1f2b253","Type":"ContainerStarted","Data":"ac9cda3f384aab9bd90ee14f7b742d5397147dcd820057f906d28dcbabd3ba7d"} Mar 19 09:37:23.793272 master-0 kubenswrapper[13205]: I0319 09:37:23.793288 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" event={"ID":"05446c48-303e-434f-9d9b-eec4f1f2b253","Type":"ContainerStarted","Data":"82dad27473a7bb2c152335369c409a1584a1b2b01be8c381ee232655db396120"} Mar 19 09:37:23.793272 master-0 kubenswrapper[13205]: I0319 09:37:23.793314 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:23.795622 master-0 kubenswrapper[13205]: I0319 09:37:23.795428 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" 
event={"ID":"289592ca-1a91-48ca-8913-94a8ed76685e","Type":"ContainerStarted","Data":"61c5586c02cf97a3e5b6f347b7ec09de8f597c34013a0a6b298ae37822047f90"} Mar 19 09:37:23.795622 master-0 kubenswrapper[13205]: I0319 09:37:23.795472 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" event={"ID":"289592ca-1a91-48ca-8913-94a8ed76685e","Type":"ContainerStarted","Data":"93a3c6edc224e0b7d443130215cb46415efd83678c8d9524c4d6a1e88c84ebe6"} Mar 19 09:37:23.795777 master-0 kubenswrapper[13205]: I0319 09:37:23.795735 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:23.799143 master-0 kubenswrapper[13205]: I0319 09:37:23.797465 13205 generic.go:334] "Generic (PLEG): container finished" podID="c07b2b70-f2f5-4575-add3-5fde88fc4848" containerID="8012bb3d65437aef2a6064b9e8bb8459d203f0cd8339be1a58c6a025aacb93fa" exitCode=0 Mar 19 09:37:23.799143 master-0 kubenswrapper[13205]: I0319 09:37:23.797962 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerDied","Data":"8012bb3d65437aef2a6064b9e8bb8459d203f0cd8339be1a58c6a025aacb93fa"} Mar 19 09:37:23.799143 master-0 kubenswrapper[13205]: I0319 09:37:23.798018 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerStarted","Data":"b96204a5a452072e25b12475e6eee83ab350c5457ccb83dd53f5ce09f0927591"} Mar 19 09:37:23.800120 master-0 kubenswrapper[13205]: I0319 09:37:23.799820 13205 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:37:23.802969 master-0 kubenswrapper[13205]: I0319 09:37:23.802922 13205 generic.go:334] "Generic (PLEG): container finished" 
podID="b6909618-647a-45d3-9027-3c3578992af1" containerID="15200ade0ff8c9a84c92fed795ece394550d0112da23c09d742a9225e11b5d49" exitCode=0 Mar 19 09:37:23.803150 master-0 kubenswrapper[13205]: I0319 09:37:23.803066 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerDied","Data":"15200ade0ff8c9a84c92fed795ece394550d0112da23c09d742a9225e11b5d49"} Mar 19 09:37:23.803202 master-0 kubenswrapper[13205]: I0319 09:37:23.803164 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerStarted","Data":"76b8a5cb330fea9b2149ccd8122b989912961e06d6d541885a56aafae09dc487"} Mar 19 09:37:23.803861 master-0 kubenswrapper[13205]: I0319 09:37:23.803825 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" Mar 19 09:37:23.806424 master-0 kubenswrapper[13205]: I0319 09:37:23.806226 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" event={"ID":"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582","Type":"ContainerStarted","Data":"1d3385f0cf48c80ca1850d0fd0468f99acefb90af2089db94da88f01e14d8f2a"} Mar 19 09:37:23.806424 master-0 kubenswrapper[13205]: I0319 09:37:23.806348 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" event={"ID":"b72ad58d-ee2c-4c99-8c1d-33ca6c48b582","Type":"ContainerStarted","Data":"9a2bf17763c8e3b1fac2d022885fc5b7288571864a8f0a67453e4c24c80a647f"} Mar 19 09:37:23.806424 master-0 kubenswrapper[13205]: I0319 09:37:23.806392 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 
09:37:23.832589 master-0 kubenswrapper[13205]: I0319 09:37:23.832428 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" podStartSLOduration=211.832391118 podStartE2EDuration="3m31.832391118s" podCreationTimestamp="2026-03-19 09:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:23.820820403 +0000 UTC m=+829.153127311" watchObservedRunningTime="2026-03-19 09:37:23.832391118 +0000 UTC m=+829.164698006" Mar 19 09:37:23.896927 master-0 kubenswrapper[13205]: I0319 09:37:23.894240 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" Mar 19 09:37:23.962787 master-0 kubenswrapper[13205]: I0319 09:37:23.962704 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7ccb4c8f4d-56pkc" podStartSLOduration=532.962676961 podStartE2EDuration="8m52.962676961s" podCreationTimestamp="2026-03-19 09:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:23.961716668 +0000 UTC m=+829.294023556" watchObservedRunningTime="2026-03-19 09:37:23.962676961 +0000 UTC m=+829.294983849" Mar 19 09:37:23.998662 master-0 kubenswrapper[13205]: I0319 09:37:23.998403 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6678cf68fb-dd79n" podStartSLOduration=211.998386109 podStartE2EDuration="3m31.998386109s" podCreationTimestamp="2026-03-19 09:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:23.994837762 +0000 UTC m=+829.327144650" 
watchObservedRunningTime="2026-03-19 09:37:23.998386109 +0000 UTC m=+829.330692997" Mar 19 09:37:24.033889 master-0 kubenswrapper[13205]: I0319 09:37:24.032208 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55975d94bc-hbbc2" Mar 19 09:37:24.818736 master-0 kubenswrapper[13205]: I0319 09:37:24.818685 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerStarted","Data":"e6d4dd21c6f2d4e2d50aa6e5cdc1584a4d4b119a846f80b08703710ace43a397"} Mar 19 09:37:24.818736 master-0 kubenswrapper[13205]: I0319 09:37:24.818735 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerStarted","Data":"3eb95285ee66b1b6d8f0c92b1c77e68456f6382a0882c184ab8bf57f924b6895"} Mar 19 09:37:24.818736 master-0 kubenswrapper[13205]: I0319 09:37:24.818745 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerStarted","Data":"17803a8ee7e24ce5857161d9c21060fd9ccb53ad16c43e80394d1c2398734f35"} Mar 19 09:37:24.819255 master-0 kubenswrapper[13205]: I0319 09:37:24.818756 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerStarted","Data":"73fcbafa7754f5c56cda7e9087d73840ef528d6a915a4667264d1303481bea2b"} Mar 19 09:37:24.819255 master-0 kubenswrapper[13205]: I0319 09:37:24.818765 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerStarted","Data":"4d168c34bbb1062297d499473587a825b68129018425ea36f64bfd2d544bc58f"} Mar 19 09:37:25.116051 master-0 
kubenswrapper[13205]: I0319 09:37:25.115966 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-6-master-0"] Mar 19 09:37:25.117186 master-0 kubenswrapper[13205]: I0319 09:37:25.117163 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.120033 master-0 kubenswrapper[13205]: I0319 09:37:25.119945 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nm2j7" Mar 19 09:37:25.120241 master-0 kubenswrapper[13205]: I0319 09:37:25.119956 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:37:25.130732 master-0 kubenswrapper[13205]: I0319 09:37:25.130485 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-master-0"] Mar 19 09:37:25.137487 master-0 kubenswrapper[13205]: I0319 09:37:25.137377 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-var-lock\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.137487 master-0 kubenswrapper[13205]: I0319 09:37:25.137484 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc0cf46f-a311-4083-9187-8fb45c1106dd-kube-api-access\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.138167 master-0 kubenswrapper[13205]: I0319 09:37:25.138123 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.241127 master-0 kubenswrapper[13205]: I0319 09:37:25.241057 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-var-lock\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.241365 master-0 kubenswrapper[13205]: I0319 09:37:25.241145 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc0cf46f-a311-4083-9187-8fb45c1106dd-kube-api-access\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.241365 master-0 kubenswrapper[13205]: I0319 09:37:25.241172 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-var-lock\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.241365 master-0 kubenswrapper[13205]: I0319 09:37:25.241231 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.241365 master-0 kubenswrapper[13205]: I0319 09:37:25.241346 13205 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.271087 master-0 kubenswrapper[13205]: I0319 09:37:25.271046 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc0cf46f-a311-4083-9187-8fb45c1106dd-kube-api-access\") pod \"installer-6-master-0\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.455744 master-0 kubenswrapper[13205]: I0319 09:37:25.455622 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:37:25.850564 master-0 kubenswrapper[13205]: I0319 09:37:25.850403 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b6909618-647a-45d3-9027-3c3578992af1","Type":"ContainerStarted","Data":"fbb000f7da69f2a38a3ba7810f1938d2e9f47fbf36b0267930f28e53662b6344"} Mar 19 09:37:25.895341 master-0 kubenswrapper[13205]: I0319 09:37:25.895280 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-master-0"] Mar 19 09:37:25.903014 master-0 kubenswrapper[13205]: I0319 09:37:25.902925 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=211.902905177 podStartE2EDuration="3m31.902905177s" podCreationTimestamp="2026-03-19 09:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:25.887463057 +0000 UTC m=+831.219769945" watchObservedRunningTime="2026-03-19 09:37:25.902905177 +0000 UTC 
m=+831.235212055" Mar 19 09:37:27.114447 master-0 kubenswrapper[13205]: W0319 09:37:27.114371 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcc0cf46f_a311_4083_9187_8fb45c1106dd.slice/crio-39241e91aac9966a0b4e28caf7d560bf716f12dd0e19034be62408df8a6de801 WatchSource:0}: Error finding container 39241e91aac9966a0b4e28caf7d560bf716f12dd0e19034be62408df8a6de801: Status 404 returned error can't find the container with id 39241e91aac9966a0b4e28caf7d560bf716f12dd0e19034be62408df8a6de801 Mar 19 09:37:27.149253 master-0 kubenswrapper[13205]: I0319 09:37:27.149194 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:37:27.153424 master-0 kubenswrapper[13205]: I0319 09:37:27.153375 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79f67cdc89-bx72w" Mar 19 09:37:27.876103 master-0 kubenswrapper[13205]: I0319 09:37:27.876043 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerStarted","Data":"0d4bac2296718b0558c455a085c2f96bc4fbd5256c83dbf612e0648d3a312c17"} Mar 19 09:37:27.876103 master-0 kubenswrapper[13205]: I0319 09:37:27.876097 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerStarted","Data":"d8f816f4d4b2bb9d70de0a1e1e5663f36737748b841bd06e263af9fcb73b7605"} Mar 19 09:37:27.876103 master-0 kubenswrapper[13205]: I0319 09:37:27.876110 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerStarted","Data":"6e052955d94bf47d842711441df3c3449fce16bae209278d40746b92266ca202"} Mar 19 09:37:27.876441 master-0 kubenswrapper[13205]: I0319 09:37:27.876121 
13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerStarted","Data":"5da1157ca38c57d5000ca90707e31c7d88103535ecf9d0edf6c93941f292524f"} Mar 19 09:37:27.878652 master-0 kubenswrapper[13205]: I0319 09:37:27.878620 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" event={"ID":"cc0cf46f-a311-4083-9187-8fb45c1106dd","Type":"ContainerStarted","Data":"671fc7de0fc2a6da81db9d09f766c6106d20f1507b4db40868e6077ac02cb6ad"} Mar 19 09:37:27.878757 master-0 kubenswrapper[13205]: I0319 09:37:27.878656 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" event={"ID":"cc0cf46f-a311-4083-9187-8fb45c1106dd","Type":"ContainerStarted","Data":"39241e91aac9966a0b4e28caf7d560bf716f12dd0e19034be62408df8a6de801"} Mar 19 09:37:27.898065 master-0 kubenswrapper[13205]: I0319 09:37:27.897945 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-6-master-0" podStartSLOduration=2.89792822 podStartE2EDuration="2.89792822s" podCreationTimestamp="2026-03-19 09:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:27.896380012 +0000 UTC m=+833.228686900" watchObservedRunningTime="2026-03-19 09:37:27.89792822 +0000 UTC m=+833.230235108" Mar 19 09:37:28.889541 master-0 kubenswrapper[13205]: I0319 09:37:28.889469 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerStarted","Data":"507644e5eb1a761a0967aca4662c08fab111ab79aaf9dd1b706196b2d6d07025"} Mar 19 09:37:28.890060 master-0 kubenswrapper[13205]: I0319 09:37:28.889559 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07b2b70-f2f5-4575-add3-5fde88fc4848","Type":"ContainerStarted","Data":"335f8c0d5e224b35cf2af4532a80bbecaeff08a44c0a910169e75b17c4fc1d19"} Mar 19 09:37:28.932072 master-0 kubenswrapper[13205]: I0319 09:37:28.930764 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=213.558719265 podStartE2EDuration="3m36.930743415s" podCreationTimestamp="2026-03-19 09:33:52 +0000 UTC" firstStartedPulling="2026-03-19 09:37:23.799745825 +0000 UTC m=+829.132052723" lastFinishedPulling="2026-03-19 09:37:27.171769975 +0000 UTC m=+832.504076873" observedRunningTime="2026-03-19 09:37:28.925098906 +0000 UTC m=+834.257405814" watchObservedRunningTime="2026-03-19 09:37:28.930743415 +0000 UTC m=+834.263050313" Mar 19 09:37:32.737939 master-0 kubenswrapper[13205]: I0319 09:37:32.737886 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:37:35.209255 master-0 kubenswrapper[13205]: E0319 09:37:35.209174 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:37:43.049062 master-0 kubenswrapper[13205]: I0319 09:37:43.048974 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: I0319 09:37:43.049259 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler" containerID="cri-o://372cd682ac3c0ea2bb18f78daead6727ce073fa9a81ef16d8eb3a25f2f9a5913" gracePeriod=30 Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: I0319 09:37:43.049302 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-recovery-controller" containerID="cri-o://d85c161426a1b0175ad90a172cca4e4d8843322ec3d411bcca9fccf3bb07ad91" gracePeriod=30 Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: I0319 09:37:43.049381 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-cert-syncer" containerID="cri-o://4ffb489960ae764e7d490cc5a515222762442a3abaf010fa816c5f4dbae9dc07" gracePeriod=30 Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: I0319 09:37:43.049485 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: E0319 09:37:43.049976 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a2f93448b9d54da9854663936e2b73" containerName="wait-for-host-port" Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: I0319 09:37:43.049999 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a2f93448b9d54da9854663936e2b73" containerName="wait-for-host-port" Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: E0319 09:37:43.050039 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-cert-syncer" Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: I0319 09:37:43.050054 13205 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-cert-syncer" Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: E0319 09:37:43.050087 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler" Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: I0319 09:37:43.050101 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler" Mar 19 09:37:43.050108 master-0 kubenswrapper[13205]: E0319 09:37:43.050126 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-recovery-controller" Mar 19 09:37:43.051942 master-0 kubenswrapper[13205]: I0319 09:37:43.050141 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-recovery-controller" Mar 19 09:37:43.051942 master-0 kubenswrapper[13205]: I0319 09:37:43.050371 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-cert-syncer" Mar 19 09:37:43.051942 master-0 kubenswrapper[13205]: I0319 09:37:43.050414 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler-recovery-controller" Mar 19 09:37:43.051942 master-0 kubenswrapper[13205]: I0319 09:37:43.050436 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a2f93448b9d54da9854663936e2b73" containerName="kube-scheduler" Mar 19 09:37:43.182636 master-0 kubenswrapper[13205]: I0319 09:37:43.182564 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/32c74216166e87f3b80af3f77a8bf69d-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"32c74216166e87f3b80af3f77a8bf69d\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:43.182782 master-0 kubenswrapper[13205]: I0319 09:37:43.182744 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/32c74216166e87f3b80af3f77a8bf69d-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"32c74216166e87f3b80af3f77a8bf69d\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:43.208150 master-0 kubenswrapper[13205]: I0319 09:37:43.208100 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_11a2f93448b9d54da9854663936e2b73/kube-scheduler-cert-syncer/0.log" Mar 19 09:37:43.208923 master-0 kubenswrapper[13205]: I0319 09:37:43.208901 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:43.212676 master-0 kubenswrapper[13205]: I0319 09:37:43.212613 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="11a2f93448b9d54da9854663936e2b73" podUID="32c74216166e87f3b80af3f77a8bf69d" Mar 19 09:37:43.284662 master-0 kubenswrapper[13205]: I0319 09:37:43.284604 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-resource-dir\") pod \"11a2f93448b9d54da9854663936e2b73\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " Mar 19 09:37:43.284788 master-0 kubenswrapper[13205]: I0319 09:37:43.284734 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "11a2f93448b9d54da9854663936e2b73" (UID: "11a2f93448b9d54da9854663936e2b73"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:43.284788 master-0 kubenswrapper[13205]: I0319 09:37:43.284775 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-cert-dir\") pod \"11a2f93448b9d54da9854663936e2b73\" (UID: \"11a2f93448b9d54da9854663936e2b73\") " Mar 19 09:37:43.284874 master-0 kubenswrapper[13205]: I0319 09:37:43.284829 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "11a2f93448b9d54da9854663936e2b73" (UID: "11a2f93448b9d54da9854663936e2b73"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:43.285033 master-0 kubenswrapper[13205]: I0319 09:37:43.284999 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/32c74216166e87f3b80af3f77a8bf69d-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"32c74216166e87f3b80af3f77a8bf69d\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:43.285130 master-0 kubenswrapper[13205]: I0319 09:37:43.285100 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/32c74216166e87f3b80af3f77a8bf69d-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"32c74216166e87f3b80af3f77a8bf69d\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:43.285166 master-0 kubenswrapper[13205]: I0319 09:37:43.285148 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/32c74216166e87f3b80af3f77a8bf69d-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: 
\"32c74216166e87f3b80af3f77a8bf69d\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:43.285255 master-0 kubenswrapper[13205]: I0319 09:37:43.285234 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:43.285304 master-0 kubenswrapper[13205]: I0319 09:37:43.285257 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/11a2f93448b9d54da9854663936e2b73-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:43.285304 master-0 kubenswrapper[13205]: I0319 09:37:43.285286 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/32c74216166e87f3b80af3f77a8bf69d-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"32c74216166e87f3b80af3f77a8bf69d\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:44.018669 master-0 kubenswrapper[13205]: I0319 09:37:44.018603 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_11a2f93448b9d54da9854663936e2b73/kube-scheduler-cert-syncer/0.log" Mar 19 09:37:44.019715 master-0 kubenswrapper[13205]: I0319 09:37:44.019673 13205 generic.go:334] "Generic (PLEG): container finished" podID="11a2f93448b9d54da9854663936e2b73" containerID="d85c161426a1b0175ad90a172cca4e4d8843322ec3d411bcca9fccf3bb07ad91" exitCode=0 Mar 19 09:37:44.019715 master-0 kubenswrapper[13205]: I0319 09:37:44.019706 13205 generic.go:334] "Generic (PLEG): container finished" podID="11a2f93448b9d54da9854663936e2b73" containerID="4ffb489960ae764e7d490cc5a515222762442a3abaf010fa816c5f4dbae9dc07" exitCode=2 Mar 19 09:37:44.019715 master-0 kubenswrapper[13205]: I0319 09:37:44.019715 13205 generic.go:334] "Generic (PLEG): container finished" 
podID="11a2f93448b9d54da9854663936e2b73" containerID="372cd682ac3c0ea2bb18f78daead6727ce073fa9a81ef16d8eb3a25f2f9a5913" exitCode=0 Mar 19 09:37:44.019848 master-0 kubenswrapper[13205]: I0319 09:37:44.019782 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7db13cbdfa075d7e1ee54bcbb0372b833ba22b656ab4ab77d01a35b4188a2e3b" Mar 19 09:37:44.019848 master-0 kubenswrapper[13205]: I0319 09:37:44.019813 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:44.022167 master-0 kubenswrapper[13205]: I0319 09:37:44.022121 13205 generic.go:334] "Generic (PLEG): container finished" podID="7e546c2e-a198-4141-a1de-518b8d71d107" containerID="593c34e2f8c4f47957630bff7700d96b8ef85638b97f56e8e46adb46fe811184" exitCode=0 Mar 19 09:37:44.022167 master-0 kubenswrapper[13205]: I0319 09:37:44.022159 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7e546c2e-a198-4141-a1de-518b8d71d107","Type":"ContainerDied","Data":"593c34e2f8c4f47957630bff7700d96b8ef85638b97f56e8e46adb46fe811184"} Mar 19 09:37:44.026073 master-0 kubenswrapper[13205]: I0319 09:37:44.026022 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="11a2f93448b9d54da9854663936e2b73" podUID="32c74216166e87f3b80af3f77a8bf69d" Mar 19 09:37:44.057170 master-0 kubenswrapper[13205]: I0319 09:37:44.057099 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="11a2f93448b9d54da9854663936e2b73" podUID="32c74216166e87f3b80af3f77a8bf69d" Mar 19 09:37:44.893373 master-0 kubenswrapper[13205]: I0319 09:37:44.863440 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="11a2f93448b9d54da9854663936e2b73" path="/var/lib/kubelet/pods/11a2f93448b9d54da9854663936e2b73/volumes" Mar 19 09:37:45.384807 master-0 kubenswrapper[13205]: I0319 09:37:45.384770 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:45.519558 master-0 kubenswrapper[13205]: I0319 09:37:45.519455 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-var-lock\") pod \"7e546c2e-a198-4141-a1de-518b8d71d107\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " Mar 19 09:37:45.519984 master-0 kubenswrapper[13205]: I0319 09:37:45.519599 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-var-lock" (OuterVolumeSpecName: "var-lock") pod "7e546c2e-a198-4141-a1de-518b8d71d107" (UID: "7e546c2e-a198-4141-a1de-518b8d71d107"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:45.519984 master-0 kubenswrapper[13205]: I0319 09:37:45.519831 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e546c2e-a198-4141-a1de-518b8d71d107-kube-api-access\") pod \"7e546c2e-a198-4141-a1de-518b8d71d107\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " Mar 19 09:37:45.519984 master-0 kubenswrapper[13205]: I0319 09:37:45.519895 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-kubelet-dir\") pod \"7e546c2e-a198-4141-a1de-518b8d71d107\" (UID: \"7e546c2e-a198-4141-a1de-518b8d71d107\") " Mar 19 09:37:45.520373 master-0 kubenswrapper[13205]: I0319 09:37:45.520123 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7e546c2e-a198-4141-a1de-518b8d71d107" (UID: "7e546c2e-a198-4141-a1de-518b8d71d107"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:45.520747 master-0 kubenswrapper[13205]: I0319 09:37:45.520705 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:45.520747 master-0 kubenswrapper[13205]: I0319 09:37:45.520734 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e546c2e-a198-4141-a1de-518b8d71d107-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:45.524099 master-0 kubenswrapper[13205]: I0319 09:37:45.524040 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e546c2e-a198-4141-a1de-518b8d71d107-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7e546c2e-a198-4141-a1de-518b8d71d107" (UID: "7e546c2e-a198-4141-a1de-518b8d71d107"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:37:45.621779 master-0 kubenswrapper[13205]: I0319 09:37:45.621668 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e546c2e-a198-4141-a1de-518b8d71d107-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:46.044124 master-0 kubenswrapper[13205]: I0319 09:37:46.044023 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7e546c2e-a198-4141-a1de-518b8d71d107","Type":"ContainerDied","Data":"db43dc0a8eb67f01b58fc008cc443cc4e30f2215501d914fa629dcfbf94c8f3b"} Mar 19 09:37:46.044124 master-0 kubenswrapper[13205]: I0319 09:37:46.044117 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db43dc0a8eb67f01b58fc008cc443cc4e30f2215501d914fa629dcfbf94c8f3b" Mar 19 09:37:46.044597 master-0 kubenswrapper[13205]: I0319 09:37:46.044515 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0" Mar 19 09:37:50.870520 master-0 kubenswrapper[13205]: I0319 09:37:50.870267 13205 scope.go:117] "RemoveContainer" containerID="0a17e7848d06038a69e2540781de2a324d8067bd69c0598df08e190c706b5066" Mar 19 09:37:56.423231 master-0 kubenswrapper[13205]: I0319 09:37:56.423043 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:37:56.424082 master-0 kubenswrapper[13205]: E0319 09:37:56.423898 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e546c2e-a198-4141-a1de-518b8d71d107" containerName="installer" Mar 19 09:37:56.424082 master-0 kubenswrapper[13205]: I0319 09:37:56.423924 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e546c2e-a198-4141-a1de-518b8d71d107" containerName="installer" Mar 19 09:37:56.424264 master-0 kubenswrapper[13205]: I0319 09:37:56.424200 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e546c2e-a198-4141-a1de-518b8d71d107" containerName="installer" Mar 19 09:37:56.428024 master-0 kubenswrapper[13205]: I0319 09:37:56.427976 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:37:56.428259 master-0 kubenswrapper[13205]: I0319 09:37:56.428244 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:37:56.428408 master-0 kubenswrapper[13205]: I0319 09:37:56.428109 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.428586 master-0 kubenswrapper[13205]: I0319 09:37:56.428518 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver" containerID="cri-o://9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118" gracePeriod=15 Mar 19 09:37:56.428675 master-0 kubenswrapper[13205]: I0319 09:37:56.428604 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-check-endpoints" containerID="cri-o://076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97" gracePeriod=15 Mar 19 09:37:56.428728 master-0 kubenswrapper[13205]: I0319 09:37:56.428674 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f" gracePeriod=15 Mar 19 09:37:56.428810 master-0 kubenswrapper[13205]: I0319 09:37:56.428690 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880" gracePeriod=15 Mar 19 09:37:56.428863 master-0 kubenswrapper[13205]: I0319 09:37:56.428674 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb" gracePeriod=15 Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: E0319 09:37:56.429269 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-check-endpoints" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429288 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-check-endpoints" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: E0319 09:37:56.429319 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-syncer" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429326 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-syncer" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: E0319 09:37:56.429352 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-insecure-readyz" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429358 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-insecure-readyz" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: E0319 09:37:56.429382 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429389 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: E0319 09:37:56.429408 13205 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429416 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: E0319 09:37:56.429426 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="setup" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429432 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="setup" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429596 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-insecure-readyz" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429613 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-check-endpoints" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429633 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429650 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver-cert-syncer" Mar 19 09:37:56.429699 master-0 kubenswrapper[13205]: I0319 09:37:56.429664 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="274c4bebf95a655851b2cf276fe43ef7" containerName="kube-apiserver" Mar 19 09:37:56.445580 master-0 kubenswrapper[13205]: I0319 09:37:56.443368 13205 status_manager.go:861] "Pod was deleted and then 
recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="274c4bebf95a655851b2cf276fe43ef7" podUID="3cae843f2a8e3c3c3212b1177305c1d5" Mar 19 09:37:56.504897 master-0 kubenswrapper[13205]: I0319 09:37:56.504815 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.504897 master-0 kubenswrapper[13205]: I0319 09:37:56.504891 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.504897 master-0 kubenswrapper[13205]: I0319 09:37:56.504926 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.505228 master-0 kubenswrapper[13205]: I0319 09:37:56.504946 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.505228 master-0 
kubenswrapper[13205]: I0319 09:37:56.504979 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.505228 master-0 kubenswrapper[13205]: I0319 09:37:56.504995 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.505228 master-0 kubenswrapper[13205]: I0319 09:37:56.505016 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.505228 master-0 kubenswrapper[13205]: I0319 09:37:56.505045 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.550708 master-0 kubenswrapper[13205]: E0319 09:37:56.550629 13205 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.606654 master-0 kubenswrapper[13205]: I0319 09:37:56.606579 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.606654 master-0 kubenswrapper[13205]: I0319 09:37:56.606658 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.606883 master-0 kubenswrapper[13205]: I0319 09:37:56.606734 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.606883 master-0 kubenswrapper[13205]: I0319 09:37:56.606785 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.606883 master-0 kubenswrapper[13205]: I0319 09:37:56.606821 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.606883 master-0 kubenswrapper[13205]: I0319 09:37:56.606840 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.606883 master-0 kubenswrapper[13205]: I0319 09:37:56.606859 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.606883 master-0 kubenswrapper[13205]: I0319 09:37:56.606890 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.606925 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.606981 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.607002 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.607021 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.607042 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.607064 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.607081 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3cae843f2a8e3c3c3212b1177305c1d5-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"3cae843f2a8e3c3c3212b1177305c1d5\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:56.607417 master-0 kubenswrapper[13205]: I0319 09:37:56.607101 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"7a4744531cb137d7252790be662d8cc8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.849099 master-0 kubenswrapper[13205]: I0319 09:37:56.849047 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:56.864994 master-0 kubenswrapper[13205]: I0319 09:37:56.864043 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:56.886131 master-0 kubenswrapper[13205]: W0319 09:37:56.886082 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a4744531cb137d7252790be662d8cc8.slice/crio-7d0e8b247c13337a4901096cd0afb92ac97537f7993aac99bd4b74feb6520213 WatchSource:0}: Error finding container 7d0e8b247c13337a4901096cd0afb92ac97537f7993aac99bd4b74feb6520213: Status 404 returned error can't find the container with id 7d0e8b247c13337a4901096cd0afb92ac97537f7993aac99bd4b74feb6520213 Mar 19 09:37:56.889485 master-0 kubenswrapper[13205]: E0319 09:37:56.889358 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e348b7a5c0286 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:7a4744531cb137d7252790be662d8cc8,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:37:56.888502918 +0000 UTC m=+862.220809826,LastTimestamp:2026-03-19 09:37:56.888502918 +0000 UTC m=+862.220809826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:37:56.903713 master-0 kubenswrapper[13205]: I0319 09:37:56.903669 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:37:56.903713 master-0 kubenswrapper[13205]: I0319 09:37:56.903710 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:37:56.905074 master-0 kubenswrapper[13205]: E0319 09:37:56.904987 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:56.905802 master-0 kubenswrapper[13205]: I0319 09:37:56.905760 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:56.950160 master-0 kubenswrapper[13205]: W0319 09:37:56.950088 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c74216166e87f3b80af3f77a8bf69d.slice/crio-8e7fdcaebef9626ca912398fdc0754412cc85a7dedee29b519e624f38e3d043a WatchSource:0}: Error finding container 8e7fdcaebef9626ca912398fdc0754412cc85a7dedee29b519e624f38e3d043a: Status 404 returned error can't find the container with id 8e7fdcaebef9626ca912398fdc0754412cc85a7dedee29b519e624f38e3d043a Mar 19 09:37:57.157082 master-0 kubenswrapper[13205]: I0319 09:37:57.157045 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_274c4bebf95a655851b2cf276fe43ef7/kube-apiserver-cert-syncer/0.log" Mar 19 09:37:57.158155 master-0 kubenswrapper[13205]: I0319 09:37:57.158117 13205 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97" exitCode=0 Mar 19 09:37:57.158221 master-0 kubenswrapper[13205]: I0319 09:37:57.158160 13205 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f" exitCode=0 Mar 19 09:37:57.158221 master-0 kubenswrapper[13205]: I0319 09:37:57.158178 13205 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880" exitCode=0 Mar 19 09:37:57.158221 master-0 kubenswrapper[13205]: I0319 09:37:57.158190 13205 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb" exitCode=2 Mar 19 09:37:57.160366 master-0 
kubenswrapper[13205]: I0319 09:37:57.160328 13205 generic.go:334] "Generic (PLEG): container finished" podID="52d83e52-4097-4c66-ad8b-bd524ff59c95" containerID="d87a4c5e1a8b8762a45b7129c8b874d4eeb705e1939872316e75c94012fbdbc4" exitCode=0 Mar 19 09:37:57.160443 master-0 kubenswrapper[13205]: I0319 09:37:57.160398 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"52d83e52-4097-4c66-ad8b-bd524ff59c95","Type":"ContainerDied","Data":"d87a4c5e1a8b8762a45b7129c8b874d4eeb705e1939872316e75c94012fbdbc4"} Mar 19 09:37:57.161619 master-0 kubenswrapper[13205]: I0319 09:37:57.161574 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:57.162251 master-0 kubenswrapper[13205]: I0319 09:37:57.162209 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"32c74216166e87f3b80af3f77a8bf69d","Type":"ContainerStarted","Data":"8e7fdcaebef9626ca912398fdc0754412cc85a7dedee29b519e624f38e3d043a"} Mar 19 09:37:57.163936 master-0 kubenswrapper[13205]: I0319 09:37:57.163889 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"7a4744531cb137d7252790be662d8cc8","Type":"ContainerStarted","Data":"7d0e8b247c13337a4901096cd0afb92ac97537f7993aac99bd4b74feb6520213"} Mar 19 09:37:58.191931 master-0 kubenswrapper[13205]: I0319 09:37:58.191861 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"32c74216166e87f3b80af3f77a8bf69d","Type":"ContainerStarted","Data":"206863945d4c792c9f5ef1d537facfbfd6b9d7b3e40d27498e2adf8234947571"} Mar 19 09:37:58.194184 master-0 kubenswrapper[13205]: I0319 09:37:58.192276 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:37:58.194184 master-0 kubenswrapper[13205]: I0319 09:37:58.192325 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:37:58.194184 master-0 kubenswrapper[13205]: I0319 09:37:58.193177 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:58.194184 master-0 kubenswrapper[13205]: E0319 09:37:58.193277 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:58.196798 master-0 kubenswrapper[13205]: I0319 09:37:58.196727 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"7a4744531cb137d7252790be662d8cc8","Type":"ContainerStarted","Data":"4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b"} Mar 19 09:37:58.200681 master-0 kubenswrapper[13205]: E0319 09:37:58.198283 13205 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:58.200681 master-0 kubenswrapper[13205]: I0319 09:37:58.198488 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:58.644309 master-0 kubenswrapper[13205]: I0319 09:37:58.644262 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 09:37:58.645695 master-0 kubenswrapper[13205]: I0319 09:37:58.645656 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:58.742554 master-0 kubenswrapper[13205]: I0319 09:37:58.742491 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52d83e52-4097-4c66-ad8b-bd524ff59c95-kube-api-access\") pod \"52d83e52-4097-4c66-ad8b-bd524ff59c95\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " Mar 19 09:37:58.742936 master-0 kubenswrapper[13205]: I0319 09:37:58.742911 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-kubelet-dir\") pod \"52d83e52-4097-4c66-ad8b-bd524ff59c95\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " Mar 19 
09:37:58.742986 master-0 kubenswrapper[13205]: I0319 09:37:58.742951 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-var-lock\") pod \"52d83e52-4097-4c66-ad8b-bd524ff59c95\" (UID: \"52d83e52-4097-4c66-ad8b-bd524ff59c95\") " Mar 19 09:37:58.743016 master-0 kubenswrapper[13205]: I0319 09:37:58.742968 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "52d83e52-4097-4c66-ad8b-bd524ff59c95" (UID: "52d83e52-4097-4c66-ad8b-bd524ff59c95"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:58.743125 master-0 kubenswrapper[13205]: I0319 09:37:58.743097 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-var-lock" (OuterVolumeSpecName: "var-lock") pod "52d83e52-4097-4c66-ad8b-bd524ff59c95" (UID: "52d83e52-4097-4c66-ad8b-bd524ff59c95"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:58.743304 master-0 kubenswrapper[13205]: I0319 09:37:58.743281 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:58.743304 master-0 kubenswrapper[13205]: I0319 09:37:58.743301 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/52d83e52-4097-4c66-ad8b-bd524ff59c95-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:58.744850 master-0 kubenswrapper[13205]: I0319 09:37:58.744798 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d83e52-4097-4c66-ad8b-bd524ff59c95-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "52d83e52-4097-4c66-ad8b-bd524ff59c95" (UID: "52d83e52-4097-4c66-ad8b-bd524ff59c95"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:37:58.803075 master-0 kubenswrapper[13205]: I0319 09:37:58.803030 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_274c4bebf95a655851b2cf276fe43ef7/kube-apiserver-cert-syncer/0.log" Mar 19 09:37:58.803638 master-0 kubenswrapper[13205]: I0319 09:37:58.803606 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:58.804361 master-0 kubenswrapper[13205]: I0319 09:37:58.804321 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:58.804810 master-0 kubenswrapper[13205]: I0319 09:37:58.804777 13205 status_manager.go:851] "Failed to get status for pod" podUID="274c4bebf95a655851b2cf276fe43ef7" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:58.844220 master-0 kubenswrapper[13205]: I0319 09:37:58.844120 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"274c4bebf95a655851b2cf276fe43ef7\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " Mar 19 09:37:58.844220 master-0 kubenswrapper[13205]: I0319 09:37:58.844209 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"274c4bebf95a655851b2cf276fe43ef7\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844268 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "274c4bebf95a655851b2cf276fe43ef7" (UID: "274c4bebf95a655851b2cf276fe43ef7"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844326 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"274c4bebf95a655851b2cf276fe43ef7\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844331 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "274c4bebf95a655851b2cf276fe43ef7" (UID: "274c4bebf95a655851b2cf276fe43ef7"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844428 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "274c4bebf95a655851b2cf276fe43ef7" (UID: "274c4bebf95a655851b2cf276fe43ef7"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844665 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844680 13205 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844688 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:58.844723 master-0 kubenswrapper[13205]: I0319 09:37:58.844697 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52d83e52-4097-4c66-ad8b-bd524ff59c95-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:37:58.858102 master-0 kubenswrapper[13205]: I0319 09:37:58.858030 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274c4bebf95a655851b2cf276fe43ef7" path="/var/lib/kubelet/pods/274c4bebf95a655851b2cf276fe43ef7/volumes" Mar 19 09:37:59.210392 master-0 kubenswrapper[13205]: I0319 09:37:59.210169 13205 generic.go:334] "Generic (PLEG): container finished" podID="32c74216166e87f3b80af3f77a8bf69d" containerID="206863945d4c792c9f5ef1d537facfbfd6b9d7b3e40d27498e2adf8234947571" exitCode=0 Mar 19 09:37:59.211178 master-0 kubenswrapper[13205]: I0319 09:37:59.210321 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"32c74216166e87f3b80af3f77a8bf69d","Type":"ContainerDied","Data":"206863945d4c792c9f5ef1d537facfbfd6b9d7b3e40d27498e2adf8234947571"} Mar 19 09:37:59.211178 master-0 kubenswrapper[13205]: I0319 09:37:59.210709 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:37:59.211178 master-0 kubenswrapper[13205]: I0319 09:37:59.210752 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:37:59.211993 master-0 kubenswrapper[13205]: I0319 09:37:59.211919 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.212088 master-0 kubenswrapper[13205]: E0319 09:37:59.211926 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:37:59.217020 master-0 kubenswrapper[13205]: I0319 09:37:59.216958 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_274c4bebf95a655851b2cf276fe43ef7/kube-apiserver-cert-syncer/0.log" Mar 19 09:37:59.218354 master-0 kubenswrapper[13205]: I0319 09:37:59.218298 13205 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118" exitCode=0 Mar 19 09:37:59.218460 master-0 
kubenswrapper[13205]: I0319 09:37:59.218388 13205 scope.go:117] "RemoveContainer" containerID="076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97" Mar 19 09:37:59.218460 master-0 kubenswrapper[13205]: I0319 09:37:59.218412 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:37:59.222218 master-0 kubenswrapper[13205]: I0319 09:37:59.222169 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.223188 master-0 kubenswrapper[13205]: I0319 09:37:59.223106 13205 status_manager.go:851] "Failed to get status for pod" podUID="274c4bebf95a655851b2cf276fe43ef7" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.223563 master-0 kubenswrapper[13205]: I0319 09:37:59.223431 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-6-master-0_cc0cf46f-a311-4083-9187-8fb45c1106dd/installer/0.log" Mar 19 09:37:59.223563 master-0 kubenswrapper[13205]: I0319 09:37:59.223485 13205 generic.go:334] "Generic (PLEG): container finished" podID="cc0cf46f-a311-4083-9187-8fb45c1106dd" containerID="671fc7de0fc2a6da81db9d09f766c6106d20f1507b4db40868e6077ac02cb6ad" exitCode=1 Mar 19 09:37:59.223563 master-0 kubenswrapper[13205]: I0319 09:37:59.223550 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" 
event={"ID":"cc0cf46f-a311-4083-9187-8fb45c1106dd","Type":"ContainerDied","Data":"671fc7de0fc2a6da81db9d09f766c6106d20f1507b4db40868e6077ac02cb6ad"} Mar 19 09:37:59.225215 master-0 kubenswrapper[13205]: I0319 09:37:59.224453 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.226179 master-0 kubenswrapper[13205]: I0319 09:37:59.226108 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.227330 master-0 kubenswrapper[13205]: I0319 09:37:59.227229 13205 status_manager.go:851] "Failed to get status for pod" podUID="274c4bebf95a655851b2cf276fe43ef7" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.227460 master-0 kubenswrapper[13205]: I0319 09:37:59.227359 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Mar 19 09:37:59.227836 master-0 kubenswrapper[13205]: I0319 09:37:59.227779 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"52d83e52-4097-4c66-ad8b-bd524ff59c95","Type":"ContainerDied","Data":"c3a2278d98bb9dc4cc015efa7b01530c696c897749750a39371c05bff7dcb12f"} Mar 19 09:37:59.227836 master-0 kubenswrapper[13205]: I0319 09:37:59.227825 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3a2278d98bb9dc4cc015efa7b01530c696c897749750a39371c05bff7dcb12f" Mar 19 09:37:59.228666 master-0 kubenswrapper[13205]: I0319 09:37:59.228489 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.228666 master-0 kubenswrapper[13205]: E0319 09:37:59.228559 13205 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:37:59.229955 master-0 kubenswrapper[13205]: I0319 09:37:59.229362 13205 status_manager.go:851] "Failed to get status for pod" podUID="274c4bebf95a655851b2cf276fe43ef7" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.230723 master-0 kubenswrapper[13205]: I0319 09:37:59.230671 13205 status_manager.go:851] "Failed to get status for pod" 
podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.231886 master-0 kubenswrapper[13205]: I0319 09:37:59.231808 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.232313 master-0 kubenswrapper[13205]: I0319 09:37:59.232262 13205 status_manager.go:851] "Failed to get status for pod" podUID="274c4bebf95a655851b2cf276fe43ef7" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.232890 master-0 kubenswrapper[13205]: I0319 09:37:59.232819 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:37:59.251692 master-0 kubenswrapper[13205]: I0319 09:37:59.251625 13205 scope.go:117] "RemoveContainer" containerID="622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f" Mar 19 09:37:59.305551 master-0 kubenswrapper[13205]: I0319 09:37:59.305482 13205 scope.go:117] "RemoveContainer" containerID="dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880" Mar 19 09:37:59.334776 master-0 kubenswrapper[13205]: 
I0319 09:37:59.334716 13205 scope.go:117] "RemoveContainer" containerID="f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb" Mar 19 09:37:59.362818 master-0 kubenswrapper[13205]: I0319 09:37:59.362777 13205 scope.go:117] "RemoveContainer" containerID="9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118" Mar 19 09:37:59.386246 master-0 kubenswrapper[13205]: I0319 09:37:59.386183 13205 scope.go:117] "RemoveContainer" containerID="400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7" Mar 19 09:37:59.408442 master-0 kubenswrapper[13205]: I0319 09:37:59.407602 13205 scope.go:117] "RemoveContainer" containerID="076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97" Mar 19 09:37:59.408442 master-0 kubenswrapper[13205]: E0319 09:37:59.408107 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97\": container with ID starting with 076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97 not found: ID does not exist" containerID="076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97" Mar 19 09:37:59.408442 master-0 kubenswrapper[13205]: I0319 09:37:59.408174 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97"} err="failed to get container status \"076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97\": rpc error: code = NotFound desc = could not find container \"076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97\": container with ID starting with 076538bd4c7100a0612055eb8146732b5393bb4a40cba04223f3c3e0ec6d4f97 not found: ID does not exist" Mar 19 09:37:59.408442 master-0 kubenswrapper[13205]: I0319 09:37:59.408218 13205 scope.go:117] "RemoveContainer" containerID="622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f" 
Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: E0319 09:37:59.408677 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f\": container with ID starting with 622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f not found: ID does not exist" containerID="622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.408712 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f"} err="failed to get container status \"622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f\": rpc error: code = NotFound desc = could not find container \"622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f\": container with ID starting with 622c0abe71e7d2d01e416c3b3d9b79cd36de8ab264a68375ae57bedde186d74f not found: ID does not exist" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.408777 13205 scope.go:117] "RemoveContainer" containerID="dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: E0319 09:37:59.409053 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880\": container with ID starting with dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880 not found: ID does not exist" containerID="dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.409078 13205 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880"} err="failed to get container status \"dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880\": rpc error: code = NotFound desc = could not find container \"dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880\": container with ID starting with dda2eeb89eba23a9330d36c63c58647d4e1a8606023fc7e613231cdbeef9e880 not found: ID does not exist" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.409098 13205 scope.go:117] "RemoveContainer" containerID="f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: E0319 09:37:59.409607 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb\": container with ID starting with f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb not found: ID does not exist" containerID="f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.409629 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb"} err="failed to get container status \"f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb\": rpc error: code = NotFound desc = could not find container \"f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb\": container with ID starting with f44908c788b64295105837947fd6893b0992d0ed4948253b4f5b18e1b7fa0acb not found: ID does not exist" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.409647 13205 scope.go:117] "RemoveContainer" containerID="9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: E0319 
09:37:59.409923 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118\": container with ID starting with 9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118 not found: ID does not exist" containerID="9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.409980 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118"} err="failed to get container status \"9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118\": rpc error: code = NotFound desc = could not find container \"9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118\": container with ID starting with 9fdb611902d12b2a92243866d2f2faa5150f2625e0d4e605ba1a9f99f66b1118 not found: ID does not exist" Mar 19 09:37:59.410192 master-0 kubenswrapper[13205]: I0319 09:37:59.409997 13205 scope.go:117] "RemoveContainer" containerID="400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7" Mar 19 09:37:59.410735 master-0 kubenswrapper[13205]: E0319 09:37:59.410286 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7\": container with ID starting with 400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7 not found: ID does not exist" containerID="400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7" Mar 19 09:37:59.410735 master-0 kubenswrapper[13205]: I0319 09:37:59.410327 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7"} err="failed to get container status 
\"400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7\": rpc error: code = NotFound desc = could not find container \"400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7\": container with ID starting with 400fc105ac6b39b7a87f9cae46662b18bb8d873f2a121117d93d4586715760b7 not found: ID does not exist" Mar 19 09:38:00.247424 master-0 kubenswrapper[13205]: I0319 09:38:00.247359 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"32c74216166e87f3b80af3f77a8bf69d","Type":"ContainerStarted","Data":"27142146c25b572f73c8b611fe230aca9b600472828b4ced3703dbf9f592e3cf"} Mar 19 09:38:00.247424 master-0 kubenswrapper[13205]: I0319 09:38:00.247416 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"32c74216166e87f3b80af3f77a8bf69d","Type":"ContainerStarted","Data":"967fa627ec262a32337d8cef24a37e02904587fa8f59830c86e09b1baf77eff0"} Mar 19 09:38:00.247424 master-0 kubenswrapper[13205]: I0319 09:38:00.247430 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"32c74216166e87f3b80af3f77a8bf69d","Type":"ContainerStarted","Data":"6ba023c4874011bc7c5c7f6971a4db9ae95e3f9a84091b1929f56c1c2eebf596"} Mar 19 09:38:00.248136 master-0 kubenswrapper[13205]: I0319 09:38:00.247497 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:38:00.248136 master-0 kubenswrapper[13205]: I0319 09:38:00.247627 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:38:00.248136 master-0 kubenswrapper[13205]: I0319 09:38:00.247677 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:38:00.248919 master-0 kubenswrapper[13205]: I0319 09:38:00.248569 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.248919 master-0 kubenswrapper[13205]: E0319 09:38:00.248602 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:38:00.249797 master-0 kubenswrapper[13205]: I0319 09:38:00.249758 13205 status_manager.go:851] "Failed to get status for pod" podUID="274c4bebf95a655851b2cf276fe43ef7" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.250233 master-0 kubenswrapper[13205]: I0319 09:38:00.250192 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.717645 master-0 kubenswrapper[13205]: I0319 09:38:00.717582 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-6-master-0_cc0cf46f-a311-4083-9187-8fb45c1106dd/installer/0.log" Mar 19 09:38:00.717645 master-0 
kubenswrapper[13205]: I0319 09:38:00.717654 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:38:00.718739 master-0 kubenswrapper[13205]: I0319 09:38:00.718681 13205 status_manager.go:851] "Failed to get status for pod" podUID="274c4bebf95a655851b2cf276fe43ef7" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.719283 master-0 kubenswrapper[13205]: I0319 09:38:00.719214 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.719881 master-0 kubenswrapper[13205]: I0319 09:38:00.719817 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.780056 master-0 kubenswrapper[13205]: I0319 09:38:00.779975 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc0cf46f-a311-4083-9187-8fb45c1106dd-kube-api-access\") pod \"cc0cf46f-a311-4083-9187-8fb45c1106dd\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " Mar 19 09:38:00.780446 master-0 kubenswrapper[13205]: I0319 09:38:00.780118 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-kubelet-dir\") pod \"cc0cf46f-a311-4083-9187-8fb45c1106dd\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " Mar 19 09:38:00.780446 master-0 kubenswrapper[13205]: I0319 09:38:00.780191 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-var-lock\") pod \"cc0cf46f-a311-4083-9187-8fb45c1106dd\" (UID: \"cc0cf46f-a311-4083-9187-8fb45c1106dd\") " Mar 19 09:38:00.780446 master-0 kubenswrapper[13205]: I0319 09:38:00.780199 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cc0cf46f-a311-4083-9187-8fb45c1106dd" (UID: "cc0cf46f-a311-4083-9187-8fb45c1106dd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:38:00.780446 master-0 kubenswrapper[13205]: I0319 09:38:00.780248 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-var-lock" (OuterVolumeSpecName: "var-lock") pod "cc0cf46f-a311-4083-9187-8fb45c1106dd" (UID: "cc0cf46f-a311-4083-9187-8fb45c1106dd"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:38:00.780446 master-0 kubenswrapper[13205]: I0319 09:38:00.780447 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:38:00.780679 master-0 kubenswrapper[13205]: I0319 09:38:00.780460 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc0cf46f-a311-4083-9187-8fb45c1106dd-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:38:00.782812 master-0 kubenswrapper[13205]: I0319 09:38:00.782755 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0cf46f-a311-4083-9187-8fb45c1106dd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cc0cf46f-a311-4083-9187-8fb45c1106dd" (UID: "cc0cf46f-a311-4083-9187-8fb45c1106dd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:38:00.874278 master-0 kubenswrapper[13205]: E0319 09:38:00.874204 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.874949 master-0 kubenswrapper[13205]: E0319 09:38:00.874888 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.875902 master-0 kubenswrapper[13205]: E0319 09:38:00.875772 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.877478 master-0 kubenswrapper[13205]: E0319 09:38:00.876631 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.877951 master-0 kubenswrapper[13205]: E0319 09:38:00.877904 13205 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:00.877951 master-0 kubenswrapper[13205]: I0319 09:38:00.877943 13205 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:38:00.878618 master-0 kubenswrapper[13205]: E0319 09:38:00.878564 13205 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 19 09:38:00.881462 master-0 kubenswrapper[13205]: I0319 09:38:00.881421 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc0cf46f-a311-4083-9187-8fb45c1106dd-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:38:01.080607 master-0 kubenswrapper[13205]: E0319 09:38:01.080498 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 09:38:01.262266 master-0 kubenswrapper[13205]: I0319 09:38:01.262232 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-6-master-0_cc0cf46f-a311-4083-9187-8fb45c1106dd/installer/0.log" Mar 19 09:38:01.263023 master-0 kubenswrapper[13205]: I0319 09:38:01.262960 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-master-0" event={"ID":"cc0cf46f-a311-4083-9187-8fb45c1106dd","Type":"ContainerDied","Data":"39241e91aac9966a0b4e28caf7d560bf716f12dd0e19034be62408df8a6de801"} Mar 19 09:38:01.263023 master-0 kubenswrapper[13205]: I0319 09:38:01.263006 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-6-master-0" Mar 19 09:38:01.263166 master-0 kubenswrapper[13205]: I0319 09:38:01.263033 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39241e91aac9966a0b4e28caf7d560bf716f12dd0e19034be62408df8a6de801" Mar 19 09:38:01.263494 master-0 kubenswrapper[13205]: I0319 09:38:01.263473 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:38:01.263610 master-0 kubenswrapper[13205]: I0319 09:38:01.263595 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f" Mar 19 09:38:01.264282 master-0 kubenswrapper[13205]: E0319 09:38:01.264230 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:38:01.267459 master-0 kubenswrapper[13205]: I0319 09:38:01.267397 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:01.268106 master-0 kubenswrapper[13205]: I0319 09:38:01.268065 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 
192.168.32.10:6443: connect: connection refused" Mar 19 09:38:01.487848 master-0 kubenswrapper[13205]: E0319 09:38:01.487680 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 09:38:02.289215 master-0 kubenswrapper[13205]: E0319 09:38:02.289155 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 09:38:03.891248 master-0 kubenswrapper[13205]: E0319 09:38:03.891158 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 09:38:04.856647 master-0 kubenswrapper[13205]: I0319 09:38:04.856418 13205 status_manager.go:851] "Failed to get status for pod" podUID="32c74216166e87f3b80af3f77a8bf69d" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:04.857979 master-0 kubenswrapper[13205]: I0319 09:38:04.857807 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:38:04.859461 master-0 
kubenswrapper[13205]: I0319 09:38:04.859326 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:06.037140 master-0 kubenswrapper[13205]: E0319 09:38:06.036944 13205 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e348b7a5c0286 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:7a4744531cb137d7252790be662d8cc8,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:37:56.888502918 +0000 UTC m=+862.220809826,LastTimestamp:2026-03-19 09:37:56.888502918 +0000 UTC m=+862.220809826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:38:07.092706 master-0 kubenswrapper[13205]: E0319 09:38:07.092615 13205 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 19 09:38:10.342173 master-0 kubenswrapper[13205]: I0319 09:38:10.342057 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_78163c60e5607dc0ccb2f836459711da/kube-controller-manager/0.log"
Mar 19 09:38:10.342173 master-0 kubenswrapper[13205]: I0319 09:38:10.342158 13205 generic.go:334] "Generic (PLEG): container finished" podID="78163c60e5607dc0ccb2f836459711da" containerID="9891f7b5295dc9e748541b1d5291c66e77a0ec82f3b11cb284bbd29bce4baf72" exitCode=1
Mar 19 09:38:10.343075 master-0 kubenswrapper[13205]: I0319 09:38:10.342219 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"78163c60e5607dc0ccb2f836459711da","Type":"ContainerDied","Data":"9891f7b5295dc9e748541b1d5291c66e77a0ec82f3b11cb284bbd29bce4baf72"}
Mar 19 09:38:10.343262 master-0 kubenswrapper[13205]: I0319 09:38:10.343196 13205 scope.go:117] "RemoveContainer" containerID="9891f7b5295dc9e748541b1d5291c66e77a0ec82f3b11cb284bbd29bce4baf72"
Mar 19 09:38:10.343856 master-0 kubenswrapper[13205]: I0319 09:38:10.343761 13205 status_manager.go:851] "Failed to get status for pod" podUID="32c74216166e87f3b80af3f77a8bf69d" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:10.345241 master-0 kubenswrapper[13205]: I0319 09:38:10.344714 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:10.345858 master-0 kubenswrapper[13205]: I0319 09:38:10.345777 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:10.347128 master-0 kubenswrapper[13205]: I0319 09:38:10.347046 13205 status_manager.go:851] "Failed to get status for pod" podUID="78163c60e5607dc0ccb2f836459711da" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.352984 master-0 kubenswrapper[13205]: I0319 09:38:11.352922 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_78163c60e5607dc0ccb2f836459711da/kube-controller-manager/0.log"
Mar 19 09:38:11.352984 master-0 kubenswrapper[13205]: I0319 09:38:11.352981 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"78163c60e5607dc0ccb2f836459711da","Type":"ContainerStarted","Data":"34566cfceb793a1b567a5c645aa383f0affc1644709bb43d497052e54db18d78"}
Mar 19 09:38:11.354347 master-0 kubenswrapper[13205]: I0319 09:38:11.354264 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.355061 master-0 kubenswrapper[13205]: I0319 09:38:11.354985 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.355722 master-0 kubenswrapper[13205]: I0319 09:38:11.355643 13205 status_manager.go:851] "Failed to get status for pod" podUID="78163c60e5607dc0ccb2f836459711da" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.356309 master-0 kubenswrapper[13205]: I0319 09:38:11.356250 13205 status_manager.go:851] "Failed to get status for pod" podUID="32c74216166e87f3b80af3f77a8bf69d" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.849061 master-0 kubenswrapper[13205]: I0319 09:38:11.849004 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:11.851717 master-0 kubenswrapper[13205]: I0319 09:38:11.851620 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.852698 master-0 kubenswrapper[13205]: I0319 09:38:11.852640 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.854989 master-0 kubenswrapper[13205]: I0319 09:38:11.854876 13205 status_manager.go:851] "Failed to get status for pod" podUID="78163c60e5607dc0ccb2f836459711da" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.855972 master-0 kubenswrapper[13205]: I0319 09:38:11.855906 13205 status_manager.go:851] "Failed to get status for pod" podUID="32c74216166e87f3b80af3f77a8bf69d" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:11.886958 master-0 kubenswrapper[13205]: I0319 09:38:11.886844 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:11.886958 master-0 kubenswrapper[13205]: I0319 09:38:11.886904 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:11.888234 master-0 kubenswrapper[13205]: E0319 09:38:11.888165 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:11.889136 master-0 kubenswrapper[13205]: I0319 09:38:11.889011 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:11.923418 master-0 kubenswrapper[13205]: W0319 09:38:11.923343 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cae843f2a8e3c3c3212b1177305c1d5.slice/crio-6dd6315ccf6cb629d9be7efdda961aba8623aae087b839bd7a08a45624918d74 WatchSource:0}: Error finding container 6dd6315ccf6cb629d9be7efdda961aba8623aae087b839bd7a08a45624918d74: Status 404 returned error can't find the container with id 6dd6315ccf6cb629d9be7efdda961aba8623aae087b839bd7a08a45624918d74
Mar 19 09:38:12.367235 master-0 kubenswrapper[13205]: I0319 09:38:12.367102 13205 generic.go:334] "Generic (PLEG): container finished" podID="3cae843f2a8e3c3c3212b1177305c1d5" containerID="547243f1428a8642ba3a64623d2a41575b5ac14fdff3e59290a35c5ef342c46e" exitCode=0
Mar 19 09:38:12.367235 master-0 kubenswrapper[13205]: I0319 09:38:12.367193 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerDied","Data":"547243f1428a8642ba3a64623d2a41575b5ac14fdff3e59290a35c5ef342c46e"}
Mar 19 09:38:12.367833 master-0 kubenswrapper[13205]: I0319 09:38:12.367260 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"6dd6315ccf6cb629d9be7efdda961aba8623aae087b839bd7a08a45624918d74"}
Mar 19 09:38:12.367892 master-0 kubenswrapper[13205]: I0319 09:38:12.367833 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:12.367892 master-0 kubenswrapper[13205]: I0319 09:38:12.367866 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:12.369029 master-0 kubenswrapper[13205]: E0319 09:38:12.368952 13205 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:12.369155 master-0 kubenswrapper[13205]: I0319 09:38:12.368969 13205 status_manager.go:851] "Failed to get status for pod" podUID="32c74216166e87f3b80af3f77a8bf69d" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:12.370034 master-0 kubenswrapper[13205]: I0319 09:38:12.369941 13205 status_manager.go:851] "Failed to get status for pod" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:12.371353 master-0 kubenswrapper[13205]: I0319 09:38:12.371235 13205 status_manager.go:851] "Failed to get status for pod" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" pod="openshift-kube-controller-manager/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:12.372438 master-0 kubenswrapper[13205]: I0319 09:38:12.372355 13205 status_manager.go:851] "Failed to get status for pod" podUID="78163c60e5607dc0ccb2f836459711da" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:38:13.377906 master-0 kubenswrapper[13205]: I0319 09:38:13.377865 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"6e415400e0d2fe47bd725d39c69a59898f71075a8fbc0a70879b15985433ec4d"}
Mar 19 09:38:13.378356 master-0 kubenswrapper[13205]: I0319 09:38:13.377912 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"70f1c8227f934db913f64140042b8e479f5a1ea89a50e78e4c8da89e28837188"}
Mar 19 09:38:13.378356 master-0 kubenswrapper[13205]: I0319 09:38:13.377923 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"371f7bf3fb45aab55d89797fcf336617e76c7478e8f6c1c935700bae68b19481"}
Mar 19 09:38:14.386830 master-0 kubenswrapper[13205]: I0319 09:38:14.386776 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"a520e7431b9d661f96eed4f852d4e97d9b8ade64eaf413a45856cda5dfd9186d"}
Mar 19 09:38:14.386830 master-0 kubenswrapper[13205]: I0319 09:38:14.386821 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"3cae843f2a8e3c3c3212b1177305c1d5","Type":"ContainerStarted","Data":"03f3694c3474460e8419a7a8d0b783cbf47b690e965f8d338dbcb1b8546fd937"}
Mar 19 09:38:14.387490 master-0 kubenswrapper[13205]: I0319 09:38:14.387038 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:14.387490 master-0 kubenswrapper[13205]: I0319 09:38:14.387051 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:14.387490 master-0 kubenswrapper[13205]: I0319 09:38:14.387213 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:15.888403 master-0 kubenswrapper[13205]: I0319 09:38:15.888345 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:15.893612 master-0 kubenswrapper[13205]: I0319 09:38:15.890748 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:15.898001 master-0 kubenswrapper[13205]: I0319 09:38:15.897957 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:16.889839 master-0 kubenswrapper[13205]: I0319 09:38:16.889747 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:16.889839 master-0 kubenswrapper[13205]: I0319 09:38:16.889815 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:16.897443 master-0 kubenswrapper[13205]: I0319 09:38:16.897400 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:19.400948 master-0 kubenswrapper[13205]: I0319 09:38:19.400896 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:19.433982 master-0 kubenswrapper[13205]: I0319 09:38:19.433927 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:19.433982 master-0 kubenswrapper[13205]: I0319 09:38:19.433959 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:19.439244 master-0 kubenswrapper[13205]: I0319 09:38:19.439189 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:38:19.507164 master-0 kubenswrapper[13205]: I0319 09:38:19.507104 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="3cae843f2a8e3c3c3212b1177305c1d5" podUID="98a5f004-3409-410e-a650-137f250e2bea"
Mar 19 09:38:20.441487 master-0 kubenswrapper[13205]: I0319 09:38:20.441426 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:20.441487 master-0 kubenswrapper[13205]: I0319 09:38:20.441464 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3a14cb29-e2eb-41ac-b6fa-8fc7e6411416"
Mar 19 09:38:22.738427 master-0 kubenswrapper[13205]: I0319 09:38:22.738346 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:38:22.788303 master-0 kubenswrapper[13205]: I0319 09:38:22.788239 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:38:23.510114 master-0 kubenswrapper[13205]: I0319 09:38:23.510009 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:38:24.879170 master-0 kubenswrapper[13205]: I0319 09:38:24.879063 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="3cae843f2a8e3c3c3212b1177305c1d5" podUID="98a5f004-3409-410e-a650-137f250e2bea"
Mar 19 09:38:25.896871 master-0 kubenswrapper[13205]: I0319 09:38:25.896798 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:28.842326 master-0 kubenswrapper[13205]: I0319 09:38:28.842265 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 09:38:28.887508 master-0 kubenswrapper[13205]: I0319 09:38:28.887430 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 09:38:29.149758 master-0 kubenswrapper[13205]: I0319 09:38:29.149593 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 09:38:29.248511 master-0 kubenswrapper[13205]: I0319 09:38:29.248432 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 09:38:29.291712 master-0 kubenswrapper[13205]: I0319 09:38:29.291646 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 19 09:38:29.305721 master-0 kubenswrapper[13205]: I0319 09:38:29.305655 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 09:38:29.460433 master-0 kubenswrapper[13205]: I0319 09:38:29.460165 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 09:38:29.750305 master-0 kubenswrapper[13205]: I0319 09:38:29.749946 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 09:38:29.773742 master-0 kubenswrapper[13205]: I0319 09:38:29.773674 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-be6kne0s3lnpg"
Mar 19 09:38:29.988655 master-0 kubenswrapper[13205]: I0319 09:38:29.988512 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 09:38:30.363720 master-0 kubenswrapper[13205]: I0319 09:38:30.363666 13205 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 09:38:30.380414 master-0 kubenswrapper[13205]: I0319 09:38:30.380375 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-czcgc"
Mar 19 09:38:30.464348 master-0 kubenswrapper[13205]: I0319 09:38:30.464311 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-khk7h"
Mar 19 09:38:30.479012 master-0 kubenswrapper[13205]: I0319 09:38:30.478938 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 09:38:30.595258 master-0 kubenswrapper[13205]: I0319 09:38:30.595198 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 09:38:30.674093 master-0 kubenswrapper[13205]: I0319 09:38:30.673947 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 19 09:38:30.764417 master-0 kubenswrapper[13205]: I0319 09:38:30.760591 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 09:38:30.975708 master-0 kubenswrapper[13205]: I0319 09:38:30.975497 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 09:38:31.099210 master-0 kubenswrapper[13205]: I0319 09:38:31.099137 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 09:38:31.175012 master-0 kubenswrapper[13205]: I0319 09:38:31.174944 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 09:38:31.255063 master-0 kubenswrapper[13205]: I0319 09:38:31.254899 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 09:38:31.357165 master-0 kubenswrapper[13205]: I0319 09:38:31.357086 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 09:38:31.462799 master-0 kubenswrapper[13205]: I0319 09:38:31.462732 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-hwfkt"
Mar 19 09:38:31.608668 master-0 kubenswrapper[13205]: I0319 09:38:31.608625 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-xnqzt"
Mar 19 09:38:31.769488 master-0 kubenswrapper[13205]: I0319 09:38:31.769431 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 09:38:31.896157 master-0 kubenswrapper[13205]: I0319 09:38:31.896028 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 09:38:31.933911 master-0 kubenswrapper[13205]: I0319 09:38:31.933873 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 19 09:38:31.982131 master-0 kubenswrapper[13205]: I0319 09:38:31.982063 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 09:38:32.108693 master-0 kubenswrapper[13205]: I0319 09:38:32.108641 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:38:32.211039 master-0 kubenswrapper[13205]: I0319 09:38:32.210927 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 09:38:32.218965 master-0 kubenswrapper[13205]: I0319 09:38:32.218887 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 09:38:32.230976 master-0 kubenswrapper[13205]: I0319 09:38:32.230933 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 09:38:32.239479 master-0 kubenswrapper[13205]: I0319 09:38:32.239457 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-bxhvs"
Mar 19 09:38:32.265720 master-0 kubenswrapper[13205]: I0319 09:38:32.265661 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 19 09:38:32.418106 master-0 kubenswrapper[13205]: I0319 09:38:32.418028 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 09:38:32.568975 master-0 kubenswrapper[13205]: I0319 09:38:32.568898 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:38:32.667418 master-0 kubenswrapper[13205]: I0319 09:38:32.667342 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 19 09:38:32.688316 master-0 kubenswrapper[13205]: I0319 09:38:32.688249 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-rcdtx"
Mar 19 09:38:32.850060 master-0 kubenswrapper[13205]: I0319 09:38:32.849879 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:38:32.883630 master-0 kubenswrapper[13205]: I0319 09:38:32.883559 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 09:38:32.938513 master-0 kubenswrapper[13205]: I0319 09:38:32.938427 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 09:38:33.002979 master-0 kubenswrapper[13205]: I0319 09:38:33.002925 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:38:33.122476 master-0 kubenswrapper[13205]: I0319 09:38:33.122312 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:38:33.145423 master-0 kubenswrapper[13205]: I0319 09:38:33.145329 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 09:38:33.160762 master-0 kubenswrapper[13205]: I0319 09:38:33.160672 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 09:38:33.237654 master-0 kubenswrapper[13205]: I0319 09:38:33.237569 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 09:38:33.247468 master-0 kubenswrapper[13205]: I0319 09:38:33.247400 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:38:33.289718 master-0 kubenswrapper[13205]: I0319 09:38:33.289613 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:38:33.300776 master-0 kubenswrapper[13205]: I0319 09:38:33.300709 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 09:38:33.389942 master-0 kubenswrapper[13205]: I0319 09:38:33.389823 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 09:38:33.419552 master-0 kubenswrapper[13205]: I0319 09:38:33.419437 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 09:38:33.430904 master-0 kubenswrapper[13205]: I0319 09:38:33.430851 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 09:38:33.446576 master-0 kubenswrapper[13205]: I0319 09:38:33.446464 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 09:38:33.525770 master-0 kubenswrapper[13205]: I0319 09:38:33.525695 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 09:38:33.541114 master-0 kubenswrapper[13205]: I0319 09:38:33.541036 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-gpc6r"
Mar 19 09:38:33.666200 master-0 kubenswrapper[13205]: I0319 09:38:33.666024 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:38:33.673651 master-0 kubenswrapper[13205]: I0319 09:38:33.673609 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 09:38:33.682812 master-0 kubenswrapper[13205]: I0319 09:38:33.682749 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 19 09:38:33.747788 master-0 kubenswrapper[13205]: I0319 09:38:33.747739 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 09:38:33.761025 master-0 kubenswrapper[13205]: I0319 09:38:33.760960 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 09:38:33.773462 master-0 kubenswrapper[13205]: I0319 09:38:33.773403 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:38:33.809839 master-0 kubenswrapper[13205]: I0319 09:38:33.809753 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 09:38:33.810769 master-0 kubenswrapper[13205]: I0319 09:38:33.810720 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-p5dd4"
Mar 19 09:38:33.833625 master-0 kubenswrapper[13205]: I0319 09:38:33.833572 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 09:38:33.856325 master-0 kubenswrapper[13205]: I0319 09:38:33.856274 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 09:38:33.857212 master-0 kubenswrapper[13205]: I0319 09:38:33.857179 13205 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 09:38:33.941875 master-0 kubenswrapper[13205]: I0319 09:38:33.941738 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:38:33.974924 master-0 kubenswrapper[13205]: I0319 09:38:33.973075 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 09:38:33.986733 master-0 kubenswrapper[13205]: I0319 09:38:33.986677 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 09:38:34.012513 master-0 kubenswrapper[13205]: I0319 09:38:34.012433 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-pvd7b"
Mar 19 09:38:34.052470 master-0 kubenswrapper[13205]: I0319 09:38:34.052376 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 09:38:34.071200 master-0 kubenswrapper[13205]: I0319 09:38:34.071126 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 09:38:34.110323 master-0 kubenswrapper[13205]: I0319 09:38:34.110244 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 09:38:34.172168 master-0 kubenswrapper[13205]: I0319 09:38:34.172081 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:38:34.234874 master-0 kubenswrapper[13205]: I0319 09:38:34.234712 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 19 09:38:34.250076 master-0 kubenswrapper[13205]: I0319 09:38:34.249984 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 19 09:38:34.310667 master-0 kubenswrapper[13205]: I0319 09:38:34.310598 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 09:38:34.313393 master-0 kubenswrapper[13205]: I0319 09:38:34.313329 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 09:38:34.317229 master-0 kubenswrapper[13205]: I0319 09:38:34.317178 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 09:38:34.430786 master-0 kubenswrapper[13205]: I0319 09:38:34.430705 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 09:38:34.511622 master-0 kubenswrapper[13205]: I0319 09:38:34.511394 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 09:38:34.519370 master-0 kubenswrapper[13205]: I0319 09:38:34.519305 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2rqbc"
Mar 19 09:38:34.519575 master-0 kubenswrapper[13205]: I0319 09:38:34.519427 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 09:38:34.539976 master-0 kubenswrapper[13205]: I0319 09:38:34.539921 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 09:38:34.590131 master-0 kubenswrapper[13205]: I0319 09:38:34.590060 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 09:38:34.599757 master-0 kubenswrapper[13205]: I0319 09:38:34.599700 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:38:34.659416 master-0 kubenswrapper[13205]: I0319 09:38:34.659359 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 09:38:34.659816 master-0 kubenswrapper[13205]: I0319 09:38:34.659782 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 09:38:34.665332 master-0 kubenswrapper[13205]: I0319 09:38:34.665269 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 19 09:38:34.677937 master-0 kubenswrapper[13205]: I0319 09:38:34.677891 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 09:38:34.763219 master-0 kubenswrapper[13205]: I0319 09:38:34.763051 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 09:38:34.778944 master-0 kubenswrapper[13205]: I0319 09:38:34.778877 13205 reflector.go:368] Caches populated for
*v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 09:38:34.792298 master-0 kubenswrapper[13205]: I0319 09:38:34.792230 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:38:34.858709 master-0 kubenswrapper[13205]: I0319 09:38:34.858621 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 09:38:34.865488 master-0 kubenswrapper[13205]: I0319 09:38:34.861969 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 09:38:35.034637 master-0 kubenswrapper[13205]: I0319 09:38:35.034552 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 19 09:38:35.128913 master-0 kubenswrapper[13205]: I0319 09:38:35.128854 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:38:35.147557 master-0 kubenswrapper[13205]: I0319 09:38:35.147467 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-hnn89" Mar 19 09:38:35.163394 master-0 kubenswrapper[13205]: I0319 09:38:35.163346 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 09:38:35.219517 master-0 kubenswrapper[13205]: E0319 09:38:35.219451 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 
09:38:35.226472 master-0 kubenswrapper[13205]: I0319 09:38:35.226391 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 09:38:35.261728 master-0 kubenswrapper[13205]: I0319 09:38:35.261643 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 09:38:35.281644 master-0 kubenswrapper[13205]: I0319 09:38:35.281501 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:38:35.302242 master-0 kubenswrapper[13205]: I0319 09:38:35.302068 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 09:38:35.341765 master-0 kubenswrapper[13205]: I0319 09:38:35.341651 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 09:38:35.468351 master-0 kubenswrapper[13205]: I0319 09:38:35.468269 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:38:35.543901 master-0 kubenswrapper[13205]: I0319 09:38:35.543809 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 09:38:35.601934 master-0 kubenswrapper[13205]: I0319 09:38:35.601765 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6j7vofh1gbciq" Mar 19 09:38:35.606249 master-0 kubenswrapper[13205]: I0319 09:38:35.606192 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 19 09:38:35.624920 master-0 kubenswrapper[13205]: I0319 09:38:35.624855 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 
09:38:35.632709 master-0 kubenswrapper[13205]: I0319 09:38:35.632667 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:38:35.682506 master-0 kubenswrapper[13205]: I0319 09:38:35.682444 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:38:35.742922 master-0 kubenswrapper[13205]: I0319 09:38:35.742866 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:38:35.782722 master-0 kubenswrapper[13205]: I0319 09:38:35.782682 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 09:38:35.819192 master-0 kubenswrapper[13205]: I0319 09:38:35.819166 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 09:38:35.863791 master-0 kubenswrapper[13205]: I0319 09:38:35.863708 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 09:38:35.883345 master-0 kubenswrapper[13205]: I0319 09:38:35.883308 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-lpljf" Mar 19 09:38:35.907783 master-0 kubenswrapper[13205]: I0319 09:38:35.907724 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 09:38:35.981175 master-0 kubenswrapper[13205]: I0319 09:38:35.981117 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 09:38:36.027674 master-0 kubenswrapper[13205]: I0319 09:38:36.025185 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-zbzf5" Mar 
19 09:38:36.062225 master-0 kubenswrapper[13205]: I0319 09:38:36.062150 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 09:38:36.066089 master-0 kubenswrapper[13205]: I0319 09:38:36.066027 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 19 09:38:36.124098 master-0 kubenswrapper[13205]: I0319 09:38:36.123925 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:38:36.124380 master-0 kubenswrapper[13205]: I0319 09:38:36.124094 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:38:36.231057 master-0 kubenswrapper[13205]: I0319 09:38:36.230977 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 09:38:36.263420 master-0 kubenswrapper[13205]: I0319 09:38:36.263355 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:38:36.265819 master-0 kubenswrapper[13205]: I0319 09:38:36.265786 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:38:36.289725 master-0 kubenswrapper[13205]: I0319 09:38:36.289670 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 09:38:36.289725 master-0 kubenswrapper[13205]: I0319 09:38:36.289681 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-fdmh2" Mar 19 09:38:36.338923 master-0 kubenswrapper[13205]: I0319 09:38:36.338856 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 
09:38:36.372788 master-0 kubenswrapper[13205]: I0319 09:38:36.372682 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-7dv6h" Mar 19 09:38:36.462129 master-0 kubenswrapper[13205]: I0319 09:38:36.461755 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:38:36.606655 master-0 kubenswrapper[13205]: I0319 09:38:36.606583 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 09:38:36.735602 master-0 kubenswrapper[13205]: I0319 09:38:36.735390 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:38:36.742335 master-0 kubenswrapper[13205]: I0319 09:38:36.742256 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 09:38:36.748570 master-0 kubenswrapper[13205]: I0319 09:38:36.745966 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 09:38:36.757979 master-0 kubenswrapper[13205]: I0319 09:38:36.757934 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 09:38:36.759114 master-0 kubenswrapper[13205]: I0319 09:38:36.758167 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 09:38:36.823898 master-0 kubenswrapper[13205]: I0319 09:38:36.823760 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 09:38:36.857046 master-0 kubenswrapper[13205]: I0319 09:38:36.856948 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 09:38:36.887343 master-0 kubenswrapper[13205]: I0319 09:38:36.887265 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 09:38:36.960737 master-0 kubenswrapper[13205]: I0319 09:38:36.960620 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:38:36.969376 master-0 kubenswrapper[13205]: I0319 09:38:36.969313 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 09:38:36.983674 master-0 kubenswrapper[13205]: I0319 09:38:36.983609 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:38:36.998758 master-0 kubenswrapper[13205]: I0319 09:38:36.998590 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-92z97" Mar 19 09:38:37.072475 master-0 kubenswrapper[13205]: I0319 09:38:37.072403 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 09:38:37.083681 master-0 kubenswrapper[13205]: I0319 09:38:37.083624 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:38:37.089055 master-0 kubenswrapper[13205]: I0319 09:38:37.089001 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-s9ktx" Mar 19 09:38:37.111643 master-0 kubenswrapper[13205]: I0319 09:38:37.111580 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:38:37.122216 master-0 kubenswrapper[13205]: I0319 09:38:37.122153 13205 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 09:38:37.126673 master-0 kubenswrapper[13205]: I0319 09:38:37.126613 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:38:37.154437 master-0 kubenswrapper[13205]: I0319 09:38:37.153274 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:38:37.248445 master-0 kubenswrapper[13205]: I0319 09:38:37.248358 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:38:37.288570 master-0 kubenswrapper[13205]: I0319 09:38:37.288447 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-p8jxl" Mar 19 09:38:37.303616 master-0 kubenswrapper[13205]: I0319 09:38:37.303499 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 09:38:37.403865 master-0 kubenswrapper[13205]: I0319 09:38:37.403811 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-r2bsk" Mar 19 09:38:37.423345 master-0 kubenswrapper[13205]: I0319 09:38:37.423277 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 09:38:37.439551 master-0 kubenswrapper[13205]: I0319 09:38:37.439489 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gfzzh" Mar 19 09:38:37.509725 master-0 kubenswrapper[13205]: I0319 09:38:37.509647 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 09:38:37.520780 master-0 kubenswrapper[13205]: I0319 
09:38:37.520686 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 09:38:37.558064 master-0 kubenswrapper[13205]: I0319 09:38:37.557894 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:38:37.564096 master-0 kubenswrapper[13205]: I0319 09:38:37.564013 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 09:38:37.582186 master-0 kubenswrapper[13205]: I0319 09:38:37.582097 13205 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 09:38:37.588223 master-0 kubenswrapper[13205]: I0319 09:38:37.588129 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-45zpl" Mar 19 09:38:37.589901 master-0 kubenswrapper[13205]: I0319 09:38:37.589825 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:38:37.590059 master-0 kubenswrapper[13205]: I0319 09:38:37.589958 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:38:37.600505 master-0 kubenswrapper[13205]: I0319 09:38:37.600410 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:38:37.612672 master-0 kubenswrapper[13205]: I0319 09:38:37.612583 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 09:38:37.614609 master-0 kubenswrapper[13205]: I0319 09:38:37.614556 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:38:37.625832 master-0 kubenswrapper[13205]: I0319 09:38:37.625726 13205 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=18.625702354 podStartE2EDuration="18.625702354s" podCreationTimestamp="2026-03-19 09:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:38:37.619612155 +0000 UTC m=+902.951919063" watchObservedRunningTime="2026-03-19 09:38:37.625702354 +0000 UTC m=+902.958009252" Mar 19 09:38:37.667717 master-0 kubenswrapper[13205]: I0319 09:38:37.667656 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:38:37.706417 master-0 kubenswrapper[13205]: I0319 09:38:37.706369 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:38:37.708504 master-0 kubenswrapper[13205]: I0319 09:38:37.708433 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 09:38:37.763025 master-0 kubenswrapper[13205]: I0319 09:38:37.762974 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:38:37.782916 master-0 kubenswrapper[13205]: I0319 09:38:37.782855 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:38:37.853738 master-0 kubenswrapper[13205]: I0319 09:38:37.853274 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 09:38:37.859514 master-0 kubenswrapper[13205]: I0319 09:38:37.859459 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:38:37.932882 master-0 kubenswrapper[13205]: I0319 
09:38:37.932803 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 09:38:38.010933 master-0 kubenswrapper[13205]: I0319 09:38:38.010836 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:38:38.045883 master-0 kubenswrapper[13205]: I0319 09:38:38.045807 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:38:38.070129 master-0 kubenswrapper[13205]: I0319 09:38:38.070059 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 09:38:38.149914 master-0 kubenswrapper[13205]: I0319 09:38:38.149739 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 09:38:38.182805 master-0 kubenswrapper[13205]: I0319 09:38:38.182687 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-qn2w4" Mar 19 09:38:38.222100 master-0 kubenswrapper[13205]: I0319 09:38:38.222038 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 09:38:38.283731 master-0 kubenswrapper[13205]: I0319 09:38:38.280284 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 09:38:38.467234 master-0 kubenswrapper[13205]: I0319 09:38:38.467109 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 09:38:38.472919 master-0 kubenswrapper[13205]: I0319 09:38:38.472867 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:38:38.499279 master-0 kubenswrapper[13205]: I0319 09:38:38.499227 13205 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-fmrxp" Mar 19 09:38:38.618812 master-0 kubenswrapper[13205]: I0319 09:38:38.618773 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 09:38:38.648547 master-0 kubenswrapper[13205]: I0319 09:38:38.648099 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 09:38:38.688826 master-0 kubenswrapper[13205]: I0319 09:38:38.688779 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:38:38.696255 master-0 kubenswrapper[13205]: I0319 09:38:38.695804 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 19 09:38:38.712669 master-0 kubenswrapper[13205]: I0319 09:38:38.712420 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 09:38:38.811120 master-0 kubenswrapper[13205]: I0319 09:38:38.809348 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:38:38.960715 master-0 kubenswrapper[13205]: I0319 09:38:38.960671 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 09:38:38.960959 master-0 kubenswrapper[13205]: I0319 09:38:38.960732 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qttf4" Mar 19 09:38:38.960959 master-0 kubenswrapper[13205]: I0319 09:38:38.960825 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:38:38.962606 master-0 
kubenswrapper[13205]: I0319 09:38:38.962585 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:38:38.975820 master-0 kubenswrapper[13205]: I0319 09:38:38.975777 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:38:38.984762 master-0 kubenswrapper[13205]: I0319 09:38:38.984690 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-2zxtq" Mar 19 09:38:39.006650 master-0 kubenswrapper[13205]: I0319 09:38:39.006595 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 09:38:39.031190 master-0 kubenswrapper[13205]: I0319 09:38:39.031145 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:38:39.045671 master-0 kubenswrapper[13205]: I0319 09:38:39.045491 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:38:39.106037 master-0 kubenswrapper[13205]: I0319 09:38:39.105910 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 09:38:39.115272 master-0 kubenswrapper[13205]: I0319 09:38:39.115222 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 09:38:39.126250 master-0 kubenswrapper[13205]: I0319 09:38:39.126210 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 09:38:39.302719 master-0 kubenswrapper[13205]: I0319 09:38:39.302673 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-88lgr" Mar 19 09:38:39.349752 
master-0 kubenswrapper[13205]: I0319 09:38:39.349673 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:38:39.349752 master-0 kubenswrapper[13205]: I0319 09:38:39.349680 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 09:38:39.367099 master-0 kubenswrapper[13205]: I0319 09:38:39.366987 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-jvtc2" Mar 19 09:38:39.368215 master-0 kubenswrapper[13205]: I0319 09:38:39.368183 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 09:38:39.469828 master-0 kubenswrapper[13205]: I0319 09:38:39.469723 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 19 09:38:39.491174 master-0 kubenswrapper[13205]: I0319 09:38:39.491068 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 09:38:39.532148 master-0 kubenswrapper[13205]: I0319 09:38:39.532057 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-b9dtc" Mar 19 09:38:39.532433 master-0 kubenswrapper[13205]: I0319 09:38:39.532327 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 19 09:38:39.567239 master-0 kubenswrapper[13205]: I0319 09:38:39.567148 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 09:38:39.574893 master-0 kubenswrapper[13205]: I0319 09:38:39.574843 13205 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:38:39.582859 master-0 kubenswrapper[13205]: I0319 09:38:39.582800 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:38:39.589288 master-0 kubenswrapper[13205]: I0319 09:38:39.589216 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 09:38:39.603022 master-0 kubenswrapper[13205]: I0319 09:38:39.602970 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 09:38:39.615712 master-0 kubenswrapper[13205]: I0319 09:38:39.615634 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 09:38:39.695407 master-0 kubenswrapper[13205]: I0319 09:38:39.695280 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 09:38:39.720360 master-0 kubenswrapper[13205]: I0319 09:38:39.720316 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 19 09:38:39.746906 master-0 kubenswrapper[13205]: I0319 09:38:39.746869 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xh8t6" Mar 19 09:38:39.782544 master-0 kubenswrapper[13205]: I0319 09:38:39.782471 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 09:38:39.922253 master-0 kubenswrapper[13205]: I0319 09:38:39.922163 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 09:38:39.951030 master-0 kubenswrapper[13205]: I0319 09:38:39.950912 13205 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 09:38:39.984281 master-0 kubenswrapper[13205]: I0319 09:38:39.984231 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 09:38:40.051267 master-0 kubenswrapper[13205]: I0319 09:38:40.051234 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2svn2"
Mar 19 09:38:40.079860 master-0 kubenswrapper[13205]: I0319 09:38:40.079810 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 19 09:38:40.101593 master-0 kubenswrapper[13205]: I0319 09:38:40.101553 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 09:38:40.178565 master-0 kubenswrapper[13205]: I0319 09:38:40.178477 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 09:38:40.199756 master-0 kubenswrapper[13205]: I0319 09:38:40.199709 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 09:38:40.332349 master-0 kubenswrapper[13205]: I0319 09:38:40.332296 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 09:38:40.353276 master-0 kubenswrapper[13205]: I0319 09:38:40.353223 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 09:38:40.404269 master-0 kubenswrapper[13205]: I0319 09:38:40.404219 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 09:38:40.415929 master-0 kubenswrapper[13205]: I0319 09:38:40.415765 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 09:38:40.461891 master-0 kubenswrapper[13205]: I0319 09:38:40.461825 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 09:38:40.507714 master-0 kubenswrapper[13205]: I0319 09:38:40.507674 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 09:38:40.518226 master-0 kubenswrapper[13205]: I0319 09:38:40.518054 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 09:38:40.555251 master-0 kubenswrapper[13205]: I0319 09:38:40.555210 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 19 09:38:40.559644 master-0 kubenswrapper[13205]: I0319 09:38:40.559584 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 09:38:40.684805 master-0 kubenswrapper[13205]: I0319 09:38:40.684694 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tv2z8"
Mar 19 09:38:40.748782 master-0 kubenswrapper[13205]: I0319 09:38:40.748698 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 19 09:38:40.752235 master-0 kubenswrapper[13205]: I0319 09:38:40.752198 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:38:40.754787 master-0 kubenswrapper[13205]: I0319 09:38:40.754735 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 09:38:40.969688 master-0 kubenswrapper[13205]: I0319 09:38:40.969412 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 19 09:38:41.018101 master-0 kubenswrapper[13205]: I0319 09:38:41.017918 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 09:38:41.085014 master-0 kubenswrapper[13205]: I0319 09:38:41.084965 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-mvv8v"
Mar 19 09:38:41.153725 master-0 kubenswrapper[13205]: I0319 09:38:41.153659 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 09:38:41.181988 master-0 kubenswrapper[13205]: I0319 09:38:41.181911 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 09:38:41.195612 master-0 kubenswrapper[13205]: I0319 09:38:41.195555 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:38:41.264059 master-0 kubenswrapper[13205]: I0319 09:38:41.263938 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 09:38:41.265591 master-0 kubenswrapper[13205]: I0319 09:38:41.265571 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 09:38:41.327095 master-0 kubenswrapper[13205]: I0319 09:38:41.327032 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 09:38:41.327857 master-0 kubenswrapper[13205]: I0319 09:38:41.327836 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 09:38:41.335371 master-0 kubenswrapper[13205]: I0319 09:38:41.335330 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 09:38:41.348643 master-0 kubenswrapper[13205]: I0319 09:38:41.348584 13205 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 09:38:41.351678 master-0 kubenswrapper[13205]: I0319 09:38:41.351636 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-9tw96"
Mar 19 09:38:41.421745 master-0 kubenswrapper[13205]: I0319 09:38:41.421686 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:38:41.524467 master-0 kubenswrapper[13205]: I0319 09:38:41.524401 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-lsnll"
Mar 19 09:38:41.546579 master-0 kubenswrapper[13205]: I0319 09:38:41.546506 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-99v25"
Mar 19 09:38:41.569959 master-0 kubenswrapper[13205]: I0319 09:38:41.569907 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 09:38:41.572694 master-0 kubenswrapper[13205]: I0319 09:38:41.572646 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 19 09:38:41.634068 master-0 kubenswrapper[13205]: I0319 09:38:41.633998 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 09:38:41.643657 master-0 kubenswrapper[13205]: I0319 09:38:41.643593 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 19 09:38:41.674429 master-0 kubenswrapper[13205]: I0319 09:38:41.674382 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 09:38:41.836110 master-0 kubenswrapper[13205]: I0319 09:38:41.835928 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 09:38:41.897454 master-0 kubenswrapper[13205]: I0319 09:38:41.897154 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 09:38:41.910826 master-0 kubenswrapper[13205]: I0319 09:38:41.910558 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 09:38:41.935238 master-0 kubenswrapper[13205]: I0319 09:38:41.934808 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:38:41.935238 master-0 kubenswrapper[13205]: I0319 09:38:41.934873 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 09:38:41.935238 master-0 kubenswrapper[13205]: I0319 09:38:41.935086 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor" containerID="cri-o://4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b" gracePeriod=5
Mar 19 09:38:41.993156 master-0 kubenswrapper[13205]: I0319 09:38:41.992932 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 09:38:42.086180 master-0 kubenswrapper[13205]: I0319 09:38:42.086075 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 09:38:42.103114 master-0 kubenswrapper[13205]: I0319 09:38:42.103067 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 09:38:42.155255 master-0 kubenswrapper[13205]: I0319 09:38:42.155213 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-avpd2mlhiq4t"
Mar 19 09:38:42.203166 master-0 kubenswrapper[13205]: I0319 09:38:42.203116 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-sjz5s"
Mar 19 09:38:42.205340 master-0 kubenswrapper[13205]: I0319 09:38:42.205319 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 09:38:42.226295 master-0 kubenswrapper[13205]: I0319 09:38:42.226199 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 09:38:42.241576 master-0 kubenswrapper[13205]: I0319 09:38:42.241549 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 09:38:42.286845 master-0 kubenswrapper[13205]: I0319 09:38:42.286207 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-d89qv"
Mar 19 09:38:42.369017 master-0 kubenswrapper[13205]: I0319 09:38:42.368865 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-s8fs4"
Mar 19 09:38:42.540512 master-0 kubenswrapper[13205]: I0319 09:38:42.540463 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 09:38:42.663272 master-0 kubenswrapper[13205]: I0319 09:38:42.662967 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 09:38:42.789817 master-0 kubenswrapper[13205]: I0319 09:38:42.789782 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:38:42.813034 master-0 kubenswrapper[13205]: I0319 09:38:42.812993 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 09:38:42.818566 master-0 kubenswrapper[13205]: I0319 09:38:42.818500 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 09:38:42.934294 master-0 kubenswrapper[13205]: I0319 09:38:42.934116 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 19 09:38:43.003993 master-0 kubenswrapper[13205]: I0319 09:38:43.003947 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 09:38:43.059554 master-0 kubenswrapper[13205]: I0319 09:38:43.059491 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 09:38:43.075231 master-0 kubenswrapper[13205]: I0319 09:38:43.075167 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 09:38:43.092359 master-0 kubenswrapper[13205]: I0319 09:38:43.092297 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 09:38:43.121394 master-0 kubenswrapper[13205]: I0319 09:38:43.121346 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 09:38:43.121833 master-0 kubenswrapper[13205]: I0319 09:38:43.121809 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 09:38:43.373406 master-0 kubenswrapper[13205]: I0319 09:38:43.373329 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 09:38:43.374273 master-0 kubenswrapper[13205]: I0319 09:38:43.373957 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 09:38:43.377783 master-0 kubenswrapper[13205]: I0319 09:38:43.377745 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 09:38:43.403035 master-0 kubenswrapper[13205]: I0319 09:38:43.402982 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:38:43.459698 master-0 kubenswrapper[13205]: I0319 09:38:43.459628 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-4wm5n"
Mar 19 09:38:43.522613 master-0 kubenswrapper[13205]: I0319 09:38:43.522276 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 09:38:43.615325 master-0 kubenswrapper[13205]: I0319 09:38:43.615238 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 09:38:43.631580 master-0 kubenswrapper[13205]: I0319 09:38:43.631413 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 19 09:38:43.638430 master-0 kubenswrapper[13205]: I0319 09:38:43.638368 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 19 09:38:43.654064 master-0 kubenswrapper[13205]: I0319 09:38:43.654007 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 09:38:43.812495 master-0 kubenswrapper[13205]: I0319 09:38:43.812443 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:38:43.856206 master-0 kubenswrapper[13205]: I0319 09:38:43.856129 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 09:38:43.995642 master-0 kubenswrapper[13205]: I0319 09:38:43.995341 13205 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 09:38:44.177712 master-0 kubenswrapper[13205]: I0319 09:38:44.177578 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 09:38:44.185050 master-0 kubenswrapper[13205]: I0319 09:38:44.184986 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:38:44.274882 master-0 kubenswrapper[13205]: I0319 09:38:44.274811 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 09:38:44.278745 master-0 kubenswrapper[13205]: I0319 09:38:44.278662 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 09:38:44.392253 master-0 kubenswrapper[13205]: I0319 09:38:44.392179 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 09:38:44.438093 master-0 kubenswrapper[13205]: I0319 09:38:44.438026 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 09:38:44.705899 master-0 kubenswrapper[13205]: I0319 09:38:44.705738 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:38:44.723088 master-0 kubenswrapper[13205]: I0319 09:38:44.723006 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 09:38:44.763964 master-0 kubenswrapper[13205]: I0319 09:38:44.763821 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-224sj"
Mar 19 09:38:44.837653 master-0 kubenswrapper[13205]: I0319 09:38:44.837549 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 09:38:45.224730 master-0 kubenswrapper[13205]: I0319 09:38:45.224654 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:38:45.580174 master-0 kubenswrapper[13205]: I0319 09:38:45.580095 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-252nv"
Mar 19 09:38:46.469837 master-0 kubenswrapper[13205]: I0319 09:38:46.469739 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 09:38:46.913650 master-0 kubenswrapper[13205]: I0319 09:38:46.913518 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:38:46.914478 master-0 kubenswrapper[13205]: I0319 09:38:46.914080 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f"
Mar 19 09:38:46.914478 master-0 kubenswrapper[13205]: I0319 09:38:46.914107 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f"
Mar 19 09:38:46.932723 master-0 kubenswrapper[13205]: I0319 09:38:46.932627 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:38:46.940265 master-0 kubenswrapper[13205]: I0319 09:38:46.939432 13205 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f89f4c7e-39c5-49d8-8b1c-09b49aae412f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:38:46Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:38:46Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ba023c4874011bc7c5c7f6971a4db9ae95e3f9a84091b1929f56c1c2eebf596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:37:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://967fa627ec262a32337d8cef24a37e02904587fa8f59830c86e09b1baf77eff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:37:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27142146c25b572f73c8b611fe230aca9b600472828b4ced3703dbf9f592e3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:37:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}]}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-master-0\": pods \"openshift-kube-scheduler-master-0\" not found"
Mar 19 09:38:46.945361 master-0 kubenswrapper[13205]: I0319 09:38:46.945293 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:38:46.967463 master-0 kubenswrapper[13205]: I0319 09:38:46.967390 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:38:46.980070 master-0 kubenswrapper[13205]: I0319 09:38:46.980011 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:38:47.546599 master-0 kubenswrapper[13205]: I0319 09:38:47.546517 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_7a4744531cb137d7252790be662d8cc8/startup-monitor/0.log"
Mar 19 09:38:47.546899 master-0 kubenswrapper[13205]: I0319 09:38:47.546646 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:38:47.589452 master-0 kubenswrapper[13205]: I0319 09:38:47.589180 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=1.5891495679999998 podStartE2EDuration="1.589149568s" podCreationTimestamp="2026-03-19 09:38:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:38:47.572473782 +0000 UTC m=+912.904780730" watchObservedRunningTime="2026-03-19 09:38:47.589149568 +0000 UTC m=+912.921456496"
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697169 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") "
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697272 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock" (OuterVolumeSpecName: "var-lock") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697317 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") "
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697397 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log" (OuterVolumeSpecName: "var-log") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697567 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") "
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697618 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") "
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697635 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests" (OuterVolumeSpecName: "manifests") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697772 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") pod \"7a4744531cb137d7252790be662d8cc8\" (UID: \"7a4744531cb137d7252790be662d8cc8\") "
Mar 19 09:38:47.698049 master-0 kubenswrapper[13205]: I0319 09:38:47.697920 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:38:47.698851 master-0 kubenswrapper[13205]: I0319 09:38:47.698272 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:38:47.698851 master-0 kubenswrapper[13205]: I0319 09:38:47.698300 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:38:47.698851 master-0 kubenswrapper[13205]: I0319 09:38:47.698318 13205 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 09:38:47.698851 master-0 kubenswrapper[13205]: I0319 09:38:47.698335 13205 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 09:38:47.706009 master-0 kubenswrapper[13205]: I0319 09:38:47.705953 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "7a4744531cb137d7252790be662d8cc8" (UID: "7a4744531cb137d7252790be662d8cc8"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:38:47.709651 master-0 kubenswrapper[13205]: I0319 09:38:47.709605 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_7a4744531cb137d7252790be662d8cc8/startup-monitor/0.log"
Mar 19 09:38:47.709794 master-0 kubenswrapper[13205]: I0319 09:38:47.709683 13205 generic.go:334] "Generic (PLEG): container finished" podID="7a4744531cb137d7252790be662d8cc8" containerID="4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b" exitCode=137
Mar 19 09:38:47.709874 master-0 kubenswrapper[13205]: I0319 09:38:47.709794 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:38:47.709969 master-0 kubenswrapper[13205]: I0319 09:38:47.709888 13205 scope.go:117] "RemoveContainer" containerID="4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b"
Mar 19 09:38:47.710361 master-0 kubenswrapper[13205]: I0319 09:38:47.710168 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f"
Mar 19 09:38:47.710361 master-0 kubenswrapper[13205]: I0319 09:38:47.710199 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="f89f4c7e-39c5-49d8-8b1c-09b49aae412f"
Mar 19 09:38:47.772466 master-0 kubenswrapper[13205]: I0319 09:38:47.771974 13205 scope.go:117] "RemoveContainer" containerID="4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b"
Mar 19 09:38:47.773276 master-0 kubenswrapper[13205]: E0319 09:38:47.773216 13205 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b\": container with ID starting with 4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b not found: ID does not exist" containerID="4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b"
Mar 19 09:38:47.773363 master-0 kubenswrapper[13205]: I0319 09:38:47.773275 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b"} err="failed to get container status \"4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b\": rpc error: code = NotFound desc = could not find container \"4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b\": container with ID starting with 4480bed3b7a3693f08ea10613c53fbf0f22b7134b5359f687dc90373884ff91b not found: ID does not exist"
Mar 19 09:38:47.800323 master-0 kubenswrapper[13205]: I0319 09:38:47.799949 13205 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a4744531cb137d7252790be662d8cc8-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:38:48.234628 master-0 kubenswrapper[13205]: I0319 09:38:48.234513 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-f7f86499b-nc978"]
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: E0319 09:38:48.234868 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: I0319 09:38:48.234884 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: E0319 09:38:48.234907 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" containerName="installer"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: I0319 09:38:48.234916 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" containerName="installer"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: E0319 09:38:48.234937 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" containerName="installer"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: I0319 09:38:48.234946 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" containerName="installer"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: I0319 09:38:48.235122 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0cf46f-a311-4083-9187-8fb45c1106dd" containerName="installer"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: I0319 09:38:48.235142 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4744531cb137d7252790be662d8cc8" containerName="startup-monitor"
Mar 19 09:38:48.235434 master-0 kubenswrapper[13205]: I0319 09:38:48.235157 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d83e52-4097-4c66-ad8b-bd524ff59c95" containerName="installer"
Mar 19 09:38:48.236415 master-0 kubenswrapper[13205]: I0319 09:38:48.236366 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978"
Mar 19 09:38:48.238978 master-0 kubenswrapper[13205]: I0319 09:38:48.238904 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 19 09:38:48.239159 master-0 kubenswrapper[13205]: I0319 09:38:48.239040 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 19 09:38:48.239475 master-0 kubenswrapper[13205]: I0319 09:38:48.239424 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 19 09:38:48.239908 master-0 kubenswrapper[13205]: I0319 09:38:48.239860 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 19 09:38:48.241994 master-0 kubenswrapper[13205]: I0319 09:38:48.241955 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 19 09:38:48.250003 master-0 kubenswrapper[13205]: I0319 09:38:48.249928 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 19 09:38:48.259638 master-0 kubenswrapper[13205]: I0319 09:38:48.259566 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f7f86499b-nc978"]
Mar 19 09:38:48.409330 master-0 kubenswrapper[13205]: I0319 09:38:48.409226 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-secret-telemeter-client\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978"
Mar 19 09:38:48.409330 master-0 kubenswrapper[13205]: I0319 09:38:48.409310 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzb7r\" (UniqueName: \"kubernetes.io/projected/5c7e8229-0d0e-4de4-9636-30bb0de815d9-kube-api-access-lzb7r\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978"
Mar 19 09:38:48.411324 master-0 kubenswrapper[13205]: I0319 09:38:48.410226 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978"
Mar 19 09:38:48.426166 master-0 kubenswrapper[13205]: I0319 09:38:48.426079 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-serving-certs-ca-bundle\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978"
Mar 19 09:38:48.426628 master-0 kubenswrapper[13205]: I0319 09:38:48.426282 13205 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.426628 master-0 kubenswrapper[13205]: I0319 09:38:48.426363 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-metrics-client-ca\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.426628 master-0 kubenswrapper[13205]: I0319 09:38:48.426459 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.426961 master-0 kubenswrapper[13205]: I0319 09:38:48.426750 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-federate-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.528757 master-0 kubenswrapper[13205]: I0319 09:38:48.528685 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-federate-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.528970 master-0 kubenswrapper[13205]: I0319 09:38:48.528855 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-secret-telemeter-client\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.528970 master-0 kubenswrapper[13205]: I0319 09:38:48.528913 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzb7r\" (UniqueName: \"kubernetes.io/projected/5c7e8229-0d0e-4de4-9636-30bb0de815d9-kube-api-access-lzb7r\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.529114 master-0 kubenswrapper[13205]: I0319 09:38:48.528977 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.529114 master-0 kubenswrapper[13205]: I0319 09:38:48.529041 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-serving-certs-ca-bundle\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 
09:38:48.529114 master-0 kubenswrapper[13205]: I0319 09:38:48.529108 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.529399 master-0 kubenswrapper[13205]: I0319 09:38:48.529142 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-metrics-client-ca\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.529399 master-0 kubenswrapper[13205]: I0319 09:38:48.529184 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.529591 master-0 kubenswrapper[13205]: E0319 09:38:48.529567 13205 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 19 09:38:48.529700 master-0 kubenswrapper[13205]: E0319 09:38:48.529639 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls podName:5c7e8229-0d0e-4de4-9636-30bb0de815d9 nodeName:}" failed. No retries permitted until 2026-03-19 09:38:49.029610818 +0000 UTC m=+914.361917706 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls") pod "telemeter-client-f7f86499b-nc978" (UID: "5c7e8229-0d0e-4de4-9636-30bb0de815d9") : secret "telemeter-client-tls" not found Mar 19 09:38:48.531039 master-0 kubenswrapper[13205]: I0319 09:38:48.530946 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-serving-certs-ca-bundle\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.531375 master-0 kubenswrapper[13205]: I0319 09:38:48.531306 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-metrics-client-ca\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.533448 master-0 kubenswrapper[13205]: I0319 09:38:48.533385 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.534738 master-0 kubenswrapper[13205]: I0319 09:38:48.534662 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-federate-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " 
pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.537511 master-0 kubenswrapper[13205]: I0319 09:38:48.537421 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.538706 master-0 kubenswrapper[13205]: I0319 09:38:48.538197 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-secret-telemeter-client\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.559183 master-0 kubenswrapper[13205]: I0319 09:38:48.559090 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzb7r\" (UniqueName: \"kubernetes.io/projected/5c7e8229-0d0e-4de4-9636-30bb0de815d9-kube-api-access-lzb7r\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:48.867846 master-0 kubenswrapper[13205]: I0319 09:38:48.867462 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a4744531cb137d7252790be662d8cc8" path="/var/lib/kubelet/pods/7a4744531cb137d7252790be662d8cc8/volumes" Mar 19 09:38:49.040482 master-0 kubenswrapper[13205]: I0319 09:38:49.040382 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: 
\"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:49.041286 master-0 kubenswrapper[13205]: E0319 09:38:49.040825 13205 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 19 09:38:49.041286 master-0 kubenswrapper[13205]: E0319 09:38:49.041087 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls podName:5c7e8229-0d0e-4de4-9636-30bb0de815d9 nodeName:}" failed. No retries permitted until 2026-03-19 09:38:50.041049936 +0000 UTC m=+915.373356874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls") pod "telemeter-client-f7f86499b-nc978" (UID: "5c7e8229-0d0e-4de4-9636-30bb0de815d9") : secret "telemeter-client-tls" not found Mar 19 09:38:50.064175 master-0 kubenswrapper[13205]: I0319 09:38:50.064107 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:50.064740 master-0 kubenswrapper[13205]: E0319 09:38:50.064276 13205 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 19 09:38:50.064740 master-0 kubenswrapper[13205]: E0319 09:38:50.064381 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls podName:5c7e8229-0d0e-4de4-9636-30bb0de815d9 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:38:52.06436027 +0000 UTC m=+917.396667158 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls") pod "telemeter-client-f7f86499b-nc978" (UID: "5c7e8229-0d0e-4de4-9636-30bb0de815d9") : secret "telemeter-client-tls" not found Mar 19 09:38:52.102495 master-0 kubenswrapper[13205]: I0319 09:38:52.102411 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:52.103668 master-0 kubenswrapper[13205]: E0319 09:38:52.102759 13205 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 19 09:38:52.103668 master-0 kubenswrapper[13205]: E0319 09:38:52.102893 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls podName:5c7e8229-0d0e-4de4-9636-30bb0de815d9 nodeName:}" failed. No retries permitted until 2026-03-19 09:38:56.102858771 +0000 UTC m=+921.435165689 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls") pod "telemeter-client-f7f86499b-nc978" (UID: "5c7e8229-0d0e-4de4-9636-30bb0de815d9") : secret "telemeter-client-tls" not found Mar 19 09:38:56.195194 master-0 kubenswrapper[13205]: I0319 09:38:56.195070 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:38:56.196112 master-0 kubenswrapper[13205]: E0319 09:38:56.195303 13205 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 19 09:38:56.196112 master-0 kubenswrapper[13205]: E0319 09:38:56.195451 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls podName:5c7e8229-0d0e-4de4-9636-30bb0de815d9 nodeName:}" failed. No retries permitted until 2026-03-19 09:39:04.195418893 +0000 UTC m=+929.527725811 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls") pod "telemeter-client-f7f86499b-nc978" (UID: "5c7e8229-0d0e-4de4-9636-30bb0de815d9") : secret "telemeter-client-tls" not found Mar 19 09:39:01.678201 master-0 kubenswrapper[13205]: I0319 09:39:01.678113 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 09:39:03.203738 master-0 kubenswrapper[13205]: I0319 09:39:03.203229 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 19 09:39:03.945762 master-0 kubenswrapper[13205]: I0319 09:39:03.945673 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:39:04.243425 master-0 kubenswrapper[13205]: I0319 09:39:04.243256 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:39:04.244231 master-0 kubenswrapper[13205]: E0319 09:39:04.243557 13205 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 19 09:39:04.244231 master-0 kubenswrapper[13205]: E0319 09:39:04.243671 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls podName:5c7e8229-0d0e-4de4-9636-30bb0de815d9 nodeName:}" failed. No retries permitted until 2026-03-19 09:39:20.24364177 +0000 UTC m=+945.575948688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls") pod "telemeter-client-f7f86499b-nc978" (UID: "5c7e8229-0d0e-4de4-9636-30bb0de815d9") : secret "telemeter-client-tls" not found Mar 19 09:39:08.286016 master-0 kubenswrapper[13205]: I0319 09:39:08.285965 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 09:39:08.982074 master-0 kubenswrapper[13205]: I0319 09:39:08.981952 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:39:15.049169 master-0 kubenswrapper[13205]: I0319 09:39:15.049095 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:39:17.901249 master-0 kubenswrapper[13205]: I0319 09:39:17.901173 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:39:18.861299 master-0 kubenswrapper[13205]: I0319 09:39:18.861220 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 09:39:20.327666 master-0 kubenswrapper[13205]: I0319 09:39:20.327509 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:39:20.328800 master-0 kubenswrapper[13205]: E0319 09:39:20.327701 13205 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Mar 19 09:39:20.328800 master-0 kubenswrapper[13205]: E0319 09:39:20.327799 13205 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls podName:5c7e8229-0d0e-4de4-9636-30bb0de815d9 nodeName:}" failed. No retries permitted until 2026-03-19 09:39:52.327775134 +0000 UTC m=+977.660082062 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls") pod "telemeter-client-f7f86499b-nc978" (UID: "5c7e8229-0d0e-4de4-9636-30bb0de815d9") : secret "telemeter-client-tls" not found Mar 19 09:39:35.226683 master-0 kubenswrapper[13205]: E0319 09:39:35.226516 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:39:51.355078 master-0 kubenswrapper[13205]: I0319 09:39:51.354989 13205 scope.go:117] "RemoveContainer" containerID="234932c6aa4854708a10bfd6ff5c0b2a32a6ce550c7885888734f2d1075fb3a5" Mar 19 09:39:51.382094 master-0 kubenswrapper[13205]: I0319 09:39:51.382027 13205 scope.go:117] "RemoveContainer" containerID="4b8296c8aab85c007fe985852836d80847020fe70c583a261ac67856bf44c2bf" Mar 19 09:39:51.406427 master-0 kubenswrapper[13205]: I0319 09:39:51.406361 13205 scope.go:117] "RemoveContainer" containerID="f46b0b23ccdc4101d15fea4308a57f8af72710fa5156b459dd9c1fc3d0424ef4" Mar 19 09:39:51.427175 master-0 kubenswrapper[13205]: I0319 09:39:51.427113 13205 scope.go:117] "RemoveContainer" containerID="6586d1ef1d8b389dab4a4ad49608dcd75cf745858d90f13a98e0648bcd092731" Mar 19 09:39:51.453432 master-0 kubenswrapper[13205]: I0319 09:39:51.453349 13205 scope.go:117] 
"RemoveContainer" containerID="b9c3e8b758bdc9a75844b0278cf27a3810e791645794ec44f1bef75175922fcf" Mar 19 09:39:51.478323 master-0 kubenswrapper[13205]: I0319 09:39:51.478261 13205 scope.go:117] "RemoveContainer" containerID="96f501d33ba99906fcc67f343ffb6c0314c555d3c6113a843511ffa7ed7f311a" Mar 19 09:39:51.508191 master-0 kubenswrapper[13205]: I0319 09:39:51.508059 13205 scope.go:117] "RemoveContainer" containerID="826550ebd0b1d6be98355aa6c853b794d4484edfacd28e4eb43f5eadc79826f2" Mar 19 09:39:51.537268 master-0 kubenswrapper[13205]: I0319 09:39:51.537223 13205 scope.go:117] "RemoveContainer" containerID="e11c7067a7cc9283dccf50eb10db382afb4e377743f71c297da2c1fc383ce771" Mar 19 09:39:52.343242 master-0 kubenswrapper[13205]: I0319 09:39:52.343138 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:39:52.348552 master-0 kubenswrapper[13205]: I0319 09:39:52.348450 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5c7e8229-0d0e-4de4-9636-30bb0de815d9-telemeter-client-tls\") pod \"telemeter-client-f7f86499b-nc978\" (UID: \"5c7e8229-0d0e-4de4-9636-30bb0de815d9\") " pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:39:52.505276 master-0 kubenswrapper[13205]: I0319 09:39:52.505202 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" Mar 19 09:39:53.024206 master-0 kubenswrapper[13205]: I0319 09:39:53.024126 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f7f86499b-nc978"] Mar 19 09:39:53.028882 master-0 kubenswrapper[13205]: W0319 09:39:53.028811 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7e8229_0d0e_4de4_9636_30bb0de815d9.slice/crio-47f6a9017d5d91bccc31eba696714d62f306761be0ecba9aa0f74f8ac49d7f0c WatchSource:0}: Error finding container 47f6a9017d5d91bccc31eba696714d62f306761be0ecba9aa0f74f8ac49d7f0c: Status 404 returned error can't find the container with id 47f6a9017d5d91bccc31eba696714d62f306761be0ecba9aa0f74f8ac49d7f0c Mar 19 09:39:53.879618 master-0 kubenswrapper[13205]: I0319 09:39:53.879569 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" event={"ID":"5c7e8229-0d0e-4de4-9636-30bb0de815d9","Type":"ContainerStarted","Data":"47f6a9017d5d91bccc31eba696714d62f306761be0ecba9aa0f74f8ac49d7f0c"} Mar 19 09:39:55.902426 master-0 kubenswrapper[13205]: I0319 09:39:55.902322 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f7f86499b-nc978_5c7e8229-0d0e-4de4-9636-30bb0de815d9/telemeter-client/0.log" Mar 19 09:39:55.902426 master-0 kubenswrapper[13205]: I0319 09:39:55.902381 13205 generic.go:334] "Generic (PLEG): container finished" podID="5c7e8229-0d0e-4de4-9636-30bb0de815d9" containerID="ebaa88072656f53b4e7ae0f450f19995c96b6239a8fd2593dbc39491d562460f" exitCode=1 Mar 19 09:39:55.902938 master-0 kubenswrapper[13205]: I0319 09:39:55.902432 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" 
event={"ID":"5c7e8229-0d0e-4de4-9636-30bb0de815d9","Type":"ContainerStarted","Data":"00be13aff114a8ec8e34bcee0064852a5e0be3b9d699cfd6c8e910409b676068"} Mar 19 09:39:55.902938 master-0 kubenswrapper[13205]: I0319 09:39:55.902458 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" event={"ID":"5c7e8229-0d0e-4de4-9636-30bb0de815d9","Type":"ContainerStarted","Data":"fa29ac1cc817db5e6059a874706a039ac305993f265cb3697e383e783d04a296"} Mar 19 09:39:55.902938 master-0 kubenswrapper[13205]: I0319 09:39:55.902467 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" event={"ID":"5c7e8229-0d0e-4de4-9636-30bb0de815d9","Type":"ContainerDied","Data":"ebaa88072656f53b4e7ae0f450f19995c96b6239a8fd2593dbc39491d562460f"} Mar 19 09:39:55.902938 master-0 kubenswrapper[13205]: I0319 09:39:55.902902 13205 scope.go:117] "RemoveContainer" containerID="ebaa88072656f53b4e7ae0f450f19995c96b6239a8fd2593dbc39491d562460f" Mar 19 09:39:56.916267 master-0 kubenswrapper[13205]: I0319 09:39:56.916188 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f7f86499b-nc978_5c7e8229-0d0e-4de4-9636-30bb0de815d9/telemeter-client/0.log" Mar 19 09:39:56.917864 master-0 kubenswrapper[13205]: I0319 09:39:56.916285 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" event={"ID":"5c7e8229-0d0e-4de4-9636-30bb0de815d9","Type":"ContainerStarted","Data":"fd9a0cace36aada1165cac54997e9b72ec9e99fa9a8974b8dd89568302649e9f"} Mar 19 09:39:56.963961 master-0 kubenswrapper[13205]: I0319 09:39:56.963864 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-f7f86499b-nc978" podStartSLOduration=66.9063692 podStartE2EDuration="1m8.963839552s" podCreationTimestamp="2026-03-19 09:38:48 +0000 UTC" firstStartedPulling="2026-03-19 
09:39:53.032374934 +0000 UTC m=+978.364681862" lastFinishedPulling="2026-03-19 09:39:55.089845326 +0000 UTC m=+980.422152214" observedRunningTime="2026-03-19 09:39:56.950873378 +0000 UTC m=+982.283180276" watchObservedRunningTime="2026-03-19 09:39:56.963839552 +0000 UTC m=+982.296146450" Mar 19 09:39:57.562575 master-0 kubenswrapper[13205]: I0319 09:39:57.562495 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7444767f68-s8sv9"] Mar 19 09:39:57.563418 master-0 kubenswrapper[13205]: I0319 09:39:57.563386 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.576492 master-0 kubenswrapper[13205]: I0319 09:39:57.576443 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7444767f68-s8sv9"] Mar 19 09:39:57.753118 master-0 kubenswrapper[13205]: I0319 09:39:57.753050 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-service-ca\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.753118 master-0 kubenswrapper[13205]: I0319 09:39:57.753106 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-oauth-config\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.753599 master-0 kubenswrapper[13205]: I0319 09:39:57.753493 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-oauth-serving-cert\") pod 
\"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.753737 master-0 kubenswrapper[13205]: I0319 09:39:57.753638 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xd8b\" (UniqueName: \"kubernetes.io/projected/de2355a9-4fce-456b-b344-737e9f96d24c-kube-api-access-6xd8b\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.754308 master-0 kubenswrapper[13205]: I0319 09:39:57.753926 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-console-config\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.754308 master-0 kubenswrapper[13205]: I0319 09:39:57.754035 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-trusted-ca-bundle\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.754308 master-0 kubenswrapper[13205]: I0319 09:39:57.754112 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-serving-cert\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.856021 master-0 kubenswrapper[13205]: I0319 09:39:57.855859 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-service-ca\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.856507 master-0 kubenswrapper[13205]: I0319 09:39:57.856465 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-oauth-config\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.856721 master-0 kubenswrapper[13205]: I0319 09:39:57.856518 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-oauth-serving-cert\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.856803 master-0 kubenswrapper[13205]: I0319 09:39:57.856727 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xd8b\" (UniqueName: \"kubernetes.io/projected/de2355a9-4fce-456b-b344-737e9f96d24c-kube-api-access-6xd8b\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.856967 master-0 kubenswrapper[13205]: I0319 09:39:57.856918 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-console-config\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.856967 master-0 kubenswrapper[13205]: I0319 09:39:57.856962 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-trusted-ca-bundle\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.857256 master-0 kubenswrapper[13205]: I0319 09:39:57.857017 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-serving-cert\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.857782 master-0 kubenswrapper[13205]: I0319 09:39:57.857739 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-oauth-serving-cert\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.858386 master-0 kubenswrapper[13205]: I0319 09:39:57.858326 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-console-config\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.859153 master-0 kubenswrapper[13205]: I0319 09:39:57.859093 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-trusted-ca-bundle\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.859687 master-0 kubenswrapper[13205]: I0319 09:39:57.859639 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-service-ca\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.861163 master-0 kubenswrapper[13205]: I0319 09:39:57.861113 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-oauth-config\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.862382 master-0 kubenswrapper[13205]: I0319 09:39:57.861650 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-serving-cert\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.879313 master-0 kubenswrapper[13205]: I0319 09:39:57.879266 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xd8b\" (UniqueName: \"kubernetes.io/projected/de2355a9-4fce-456b-b344-737e9f96d24c-kube-api-access-6xd8b\") pod \"console-7444767f68-s8sv9\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:57.886610 master-0 kubenswrapper[13205]: I0319 09:39:57.886568 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:39:58.287563 master-0 kubenswrapper[13205]: I0319 09:39:58.287505 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7444767f68-s8sv9"] Mar 19 09:39:58.297101 master-0 kubenswrapper[13205]: W0319 09:39:58.297071 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2355a9_4fce_456b_b344_737e9f96d24c.slice/crio-e70367d9be06d99c372bf12d68a55591b18bad34df8c2cc5f657d86fa4f132f0 WatchSource:0}: Error finding container e70367d9be06d99c372bf12d68a55591b18bad34df8c2cc5f657d86fa4f132f0: Status 404 returned error can't find the container with id e70367d9be06d99c372bf12d68a55591b18bad34df8c2cc5f657d86fa4f132f0 Mar 19 09:39:58.485217 master-0 kubenswrapper[13205]: I0319 09:39:58.485160 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7444767f68-s8sv9"] Mar 19 09:39:58.539447 master-0 kubenswrapper[13205]: I0319 09:39:58.539383 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-fbc566cb4-69hmp"] Mar 19 09:39:58.540309 master-0 kubenswrapper[13205]: I0319 09:39:58.540283 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.567723 master-0 kubenswrapper[13205]: I0319 09:39:58.567669 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fbc566cb4-69hmp"] Mar 19 09:39:58.670960 master-0 kubenswrapper[13205]: I0319 09:39:58.670909 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-oauth-serving-cert\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.671252 master-0 kubenswrapper[13205]: I0319 09:39:58.671233 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ncfp\" (UniqueName: \"kubernetes.io/projected/91c4bb91-752a-42c9-bc52-61bd8e935269-kube-api-access-4ncfp\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.671380 master-0 kubenswrapper[13205]: I0319 09:39:58.671366 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-service-ca\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.671511 master-0 kubenswrapper[13205]: I0319 09:39:58.671498 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-oauth-config\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.671638 master-0 
kubenswrapper[13205]: I0319 09:39:58.671626 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-console-config\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.671746 master-0 kubenswrapper[13205]: I0319 09:39:58.671729 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-trusted-ca-bundle\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.671913 master-0 kubenswrapper[13205]: I0319 09:39:58.671901 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-serving-cert\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.774204 master-0 kubenswrapper[13205]: I0319 09:39:58.774100 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-serving-cert\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.776518 master-0 kubenswrapper[13205]: I0319 09:39:58.776474 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-oauth-serving-cert\") pod \"console-fbc566cb4-69hmp\" (UID: 
\"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.776876 master-0 kubenswrapper[13205]: I0319 09:39:58.774703 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-oauth-serving-cert\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.777150 master-0 kubenswrapper[13205]: I0319 09:39:58.777113 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ncfp\" (UniqueName: \"kubernetes.io/projected/91c4bb91-752a-42c9-bc52-61bd8e935269-kube-api-access-4ncfp\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.777399 master-0 kubenswrapper[13205]: I0319 09:39:58.777365 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-service-ca\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.777713 master-0 kubenswrapper[13205]: I0319 09:39:58.777661 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-serving-cert\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.777904 master-0 kubenswrapper[13205]: I0319 09:39:58.777867 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-oauth-config\") pod 
\"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.778249 master-0 kubenswrapper[13205]: I0319 09:39:58.778193 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-console-config\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.778493 master-0 kubenswrapper[13205]: I0319 09:39:58.778444 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-service-ca\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.778829 master-0 kubenswrapper[13205]: I0319 09:39:58.778460 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-trusted-ca-bundle\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.779261 master-0 kubenswrapper[13205]: I0319 09:39:58.779203 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-console-config\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.781137 master-0 kubenswrapper[13205]: I0319 09:39:58.781091 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-oauth-config\") pod 
\"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.782947 master-0 kubenswrapper[13205]: I0319 09:39:58.782901 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-trusted-ca-bundle\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.798303 master-0 kubenswrapper[13205]: I0319 09:39:58.798174 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ncfp\" (UniqueName: \"kubernetes.io/projected/91c4bb91-752a-42c9-bc52-61bd8e935269-kube-api-access-4ncfp\") pod \"console-fbc566cb4-69hmp\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.860976 master-0 kubenswrapper[13205]: I0319 09:39:58.860276 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:39:58.971081 master-0 kubenswrapper[13205]: I0319 09:39:58.970999 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7444767f68-s8sv9" event={"ID":"de2355a9-4fce-456b-b344-737e9f96d24c","Type":"ContainerStarted","Data":"b5ba5edf74ff9415297f4981f39a429f20575244336852fb95623670d1f2534f"} Mar 19 09:39:58.971081 master-0 kubenswrapper[13205]: I0319 09:39:58.971069 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7444767f68-s8sv9" event={"ID":"de2355a9-4fce-456b-b344-737e9f96d24c","Type":"ContainerStarted","Data":"e70367d9be06d99c372bf12d68a55591b18bad34df8c2cc5f657d86fa4f132f0"} Mar 19 09:39:59.208838 master-0 kubenswrapper[13205]: I0319 09:39:59.208419 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7444767f68-s8sv9" podStartSLOduration=2.208400266 podStartE2EDuration="2.208400266s" podCreationTimestamp="2026-03-19 09:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:39:59.006067284 +0000 UTC m=+984.338374192" watchObservedRunningTime="2026-03-19 09:39:59.208400266 +0000 UTC m=+984.540707164" Mar 19 09:39:59.216550 master-0 kubenswrapper[13205]: I0319 09:39:59.216462 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fbc566cb4-69hmp"] Mar 19 09:39:59.251114 master-0 kubenswrapper[13205]: I0319 09:39:59.251060 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-b9f6c6556-vscqq"] Mar 19 09:39:59.251842 master-0 kubenswrapper[13205]: I0319 09:39:59.251822 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.276545 master-0 kubenswrapper[13205]: I0319 09:39:59.276246 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b9f6c6556-vscqq"] Mar 19 09:39:59.328975 master-0 kubenswrapper[13205]: I0319 09:39:59.328874 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fbc566cb4-69hmp"] Mar 19 09:39:59.390248 master-0 kubenswrapper[13205]: I0319 09:39:59.390177 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-service-ca\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.390352 master-0 kubenswrapper[13205]: I0319 09:39:59.390263 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-oauth-config\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.390553 master-0 kubenswrapper[13205]: I0319 09:39:59.390500 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b77rk\" (UniqueName: \"kubernetes.io/projected/78e6748d-be2a-4245-8a1e-567e1d7aa398-kube-api-access-b77rk\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.390617 master-0 kubenswrapper[13205]: I0319 09:39:59.390591 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-config\") 
pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.390656 master-0 kubenswrapper[13205]: I0319 09:39:59.390616 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-oauth-serving-cert\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.390791 master-0 kubenswrapper[13205]: I0319 09:39:59.390745 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-serving-cert\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.390847 master-0 kubenswrapper[13205]: I0319 09:39:59.390824 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-trusted-ca-bundle\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.492837 master-0 kubenswrapper[13205]: I0319 09:39:59.492766 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-oauth-config\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.492974 master-0 kubenswrapper[13205]: I0319 09:39:59.492895 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b77rk\" (UniqueName: \"kubernetes.io/projected/78e6748d-be2a-4245-8a1e-567e1d7aa398-kube-api-access-b77rk\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.492974 master-0 kubenswrapper[13205]: I0319 09:39:59.492936 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-config\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.493068 master-0 kubenswrapper[13205]: I0319 09:39:59.492969 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-oauth-serving-cert\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.493068 master-0 kubenswrapper[13205]: I0319 09:39:59.493034 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-serving-cert\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.493130 master-0 kubenswrapper[13205]: I0319 09:39:59.493085 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-trusted-ca-bundle\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.493340 master-0 kubenswrapper[13205]: I0319 09:39:59.493304 13205 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-service-ca\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.493989 master-0 kubenswrapper[13205]: I0319 09:39:59.493946 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-config\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.494194 master-0 kubenswrapper[13205]: I0319 09:39:59.494164 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-trusted-ca-bundle\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.494585 master-0 kubenswrapper[13205]: I0319 09:39:59.494513 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-oauth-serving-cert\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.494829 master-0 kubenswrapper[13205]: I0319 09:39:59.494784 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-service-ca\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.500805 master-0 kubenswrapper[13205]: I0319 09:39:59.499170 13205 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-oauth-config\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.510545 master-0 kubenswrapper[13205]: I0319 09:39:59.508202 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-serving-cert\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.528684 master-0 kubenswrapper[13205]: I0319 09:39:59.528620 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b77rk\" (UniqueName: \"kubernetes.io/projected/78e6748d-be2a-4245-8a1e-567e1d7aa398-kube-api-access-b77rk\") pod \"console-b9f6c6556-vscqq\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.590727 master-0 kubenswrapper[13205]: I0319 09:39:59.590665 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:39:59.996767 master-0 kubenswrapper[13205]: I0319 09:39:59.991391 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fbc566cb4-69hmp" event={"ID":"91c4bb91-752a-42c9-bc52-61bd8e935269","Type":"ContainerStarted","Data":"77924f4a707facf578502f26f206a16ab0771cef3969262568d80c2e52c625d7"} Mar 19 09:39:59.996767 master-0 kubenswrapper[13205]: I0319 09:39:59.991638 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fbc566cb4-69hmp" event={"ID":"91c4bb91-752a-42c9-bc52-61bd8e935269","Type":"ContainerStarted","Data":"1cb075b1d3cdcfa9c2f25ad9ef3d5b630182e8d973f6570ecbc3f52075c2302f"} Mar 19 09:40:00.033433 master-0 kubenswrapper[13205]: I0319 09:40:00.032678 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fbc566cb4-69hmp" podStartSLOduration=2.032654088 podStartE2EDuration="2.032654088s" podCreationTimestamp="2026-03-19 09:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:40:00.020225726 +0000 UTC m=+985.352532614" watchObservedRunningTime="2026-03-19 09:40:00.032654088 +0000 UTC m=+985.364960986" Mar 19 09:40:00.041636 master-0 kubenswrapper[13205]: I0319 09:40:00.041582 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b9f6c6556-vscqq"] Mar 19 09:40:01.005436 master-0 kubenswrapper[13205]: I0319 09:40:01.005323 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b9f6c6556-vscqq" event={"ID":"78e6748d-be2a-4245-8a1e-567e1d7aa398","Type":"ContainerStarted","Data":"66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24"} Mar 19 09:40:01.005436 master-0 kubenswrapper[13205]: I0319 09:40:01.005431 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-b9f6c6556-vscqq" event={"ID":"78e6748d-be2a-4245-8a1e-567e1d7aa398","Type":"ContainerStarted","Data":"eec2b36a43be39dc2f660e26c0b9b8f2ee01a436e6bd29f57aa2cdf5ccf63113"} Mar 19 09:40:01.034983 master-0 kubenswrapper[13205]: I0319 09:40:01.034884 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b9f6c6556-vscqq" podStartSLOduration=2.034859438 podStartE2EDuration="2.034859438s" podCreationTimestamp="2026-03-19 09:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:40:01.032612413 +0000 UTC m=+986.364919331" watchObservedRunningTime="2026-03-19 09:40:01.034859438 +0000 UTC m=+986.367166356" Mar 19 09:40:07.887594 master-0 kubenswrapper[13205]: I0319 09:40:07.887513 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:40:08.860467 master-0 kubenswrapper[13205]: I0319 09:40:08.860398 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:40:09.591011 master-0 kubenswrapper[13205]: I0319 09:40:09.590934 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:40:09.591011 master-0 kubenswrapper[13205]: I0319 09:40:09.591011 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:40:09.598655 master-0 kubenswrapper[13205]: I0319 09:40:09.598601 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:40:10.116217 master-0 kubenswrapper[13205]: I0319 09:40:10.116126 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:40:10.197839 master-0 
kubenswrapper[13205]: I0319 09:40:10.197775 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79f67cdc89-bx72w"] Mar 19 09:40:14.036391 master-0 kubenswrapper[13205]: I0319 09:40:14.036309 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-6-retry-1-master-0"] Mar 19 09:40:14.037747 master-0 kubenswrapper[13205]: I0319 09:40:14.037708 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.042071 master-0 kubenswrapper[13205]: I0319 09:40:14.041934 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nm2j7" Mar 19 09:40:14.042220 master-0 kubenswrapper[13205]: I0319 09:40:14.042168 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:40:14.054353 master-0 kubenswrapper[13205]: I0319 09:40:14.054298 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-retry-1-master-0"] Mar 19 09:40:14.160562 master-0 kubenswrapper[13205]: I0319 09:40:14.154388 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.160562 master-0 kubenswrapper[13205]: I0319 09:40:14.154444 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: 
\"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.160562 master-0 kubenswrapper[13205]: I0319 09:40:14.154483 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.256186 master-0 kubenswrapper[13205]: I0319 09:40:14.256029 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.256186 master-0 kubenswrapper[13205]: I0319 09:40:14.256095 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.256186 master-0 kubenswrapper[13205]: I0319 09:40:14.256126 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.256430 master-0 kubenswrapper[13205]: I0319 09:40:14.256212 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kubelet-dir\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.256430 master-0 kubenswrapper[13205]: I0319 09:40:14.256226 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-var-lock\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.275204 master-0 kubenswrapper[13205]: I0319 09:40:14.275156 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kube-api-access\") pod \"installer-6-retry-1-master-0\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") " pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.371072 master-0 kubenswrapper[13205]: I0319 09:40:14.370972 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" Mar 19 09:40:14.797043 master-0 kubenswrapper[13205]: W0319 09:40:14.796995 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod13594ce3_7087_4af3_85eb_6c50b9e2bfd2.slice/crio-9250eba53f1f68765a2fb388f1c635984bdc78050afad2f9b26969757af7d5ae WatchSource:0}: Error finding container 9250eba53f1f68765a2fb388f1c635984bdc78050afad2f9b26969757af7d5ae: Status 404 returned error can't find the container with id 9250eba53f1f68765a2fb388f1c635984bdc78050afad2f9b26969757af7d5ae Mar 19 09:40:14.797166 master-0 kubenswrapper[13205]: I0319 09:40:14.797065 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-6-retry-1-master-0"] Mar 19 09:40:15.177675 master-0 kubenswrapper[13205]: I0319 09:40:15.173314 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" event={"ID":"13594ce3-7087-4af3-85eb-6c50b9e2bfd2","Type":"ContainerStarted","Data":"486d73682332891c32f16ccd66225151ea462ed169bd44a16ebd688d0a60ebee"} Mar 19 09:40:15.177675 master-0 kubenswrapper[13205]: I0319 09:40:15.173369 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" event={"ID":"13594ce3-7087-4af3-85eb-6c50b9e2bfd2","Type":"ContainerStarted","Data":"9250eba53f1f68765a2fb388f1c635984bdc78050afad2f9b26969757af7d5ae"} Mar 19 09:40:15.201340 master-0 kubenswrapper[13205]: I0319 09:40:15.201213 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" podStartSLOduration=1.201188889 podStartE2EDuration="1.201188889s" podCreationTimestamp="2026-03-19 09:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
09:40:15.194255731 +0000 UTC m=+1000.526562619" watchObservedRunningTime="2026-03-19 09:40:15.201188889 +0000 UTC m=+1000.533495817" Mar 19 09:40:24.046272 master-0 kubenswrapper[13205]: I0319 09:40:24.046137 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7444767f68-s8sv9" podUID="de2355a9-4fce-456b-b344-737e9f96d24c" containerName="console" containerID="cri-o://b5ba5edf74ff9415297f4981f39a429f20575244336852fb95623670d1f2534f" gracePeriod=15 Mar 19 09:40:24.248330 master-0 kubenswrapper[13205]: I0319 09:40:24.248265 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7444767f68-s8sv9_de2355a9-4fce-456b-b344-737e9f96d24c/console/0.log" Mar 19 09:40:24.248330 master-0 kubenswrapper[13205]: I0319 09:40:24.248320 13205 generic.go:334] "Generic (PLEG): container finished" podID="de2355a9-4fce-456b-b344-737e9f96d24c" containerID="b5ba5edf74ff9415297f4981f39a429f20575244336852fb95623670d1f2534f" exitCode=2 Mar 19 09:40:24.248596 master-0 kubenswrapper[13205]: I0319 09:40:24.248351 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7444767f68-s8sv9" event={"ID":"de2355a9-4fce-456b-b344-737e9f96d24c","Type":"ContainerDied","Data":"b5ba5edf74ff9415297f4981f39a429f20575244336852fb95623670d1f2534f"} Mar 19 09:40:24.500175 master-0 kubenswrapper[13205]: I0319 09:40:24.500073 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7444767f68-s8sv9_de2355a9-4fce-456b-b344-737e9f96d24c/console/0.log" Mar 19 09:40:24.500175 master-0 kubenswrapper[13205]: I0319 09:40:24.500176 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:40:24.649764 master-0 kubenswrapper[13205]: I0319 09:40:24.649685 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-console-config\") pod \"de2355a9-4fce-456b-b344-737e9f96d24c\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " Mar 19 09:40:24.650000 master-0 kubenswrapper[13205]: I0319 09:40:24.649846 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-serving-cert\") pod \"de2355a9-4fce-456b-b344-737e9f96d24c\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " Mar 19 09:40:24.650211 master-0 kubenswrapper[13205]: I0319 09:40:24.650138 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-console-config" (OuterVolumeSpecName: "console-config") pod "de2355a9-4fce-456b-b344-737e9f96d24c" (UID: "de2355a9-4fce-456b-b344-737e9f96d24c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:24.650739 master-0 kubenswrapper[13205]: I0319 09:40:24.650679 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-trusted-ca-bundle\") pod \"de2355a9-4fce-456b-b344-737e9f96d24c\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " Mar 19 09:40:24.650886 master-0 kubenswrapper[13205]: I0319 09:40:24.650846 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-oauth-config\") pod \"de2355a9-4fce-456b-b344-737e9f96d24c\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " Mar 19 09:40:24.650941 master-0 kubenswrapper[13205]: I0319 09:40:24.650918 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-service-ca\") pod \"de2355a9-4fce-456b-b344-737e9f96d24c\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " Mar 19 09:40:24.651073 master-0 kubenswrapper[13205]: I0319 09:40:24.651013 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xd8b\" (UniqueName: \"kubernetes.io/projected/de2355a9-4fce-456b-b344-737e9f96d24c-kube-api-access-6xd8b\") pod \"de2355a9-4fce-456b-b344-737e9f96d24c\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " Mar 19 09:40:24.651130 master-0 kubenswrapper[13205]: I0319 09:40:24.651095 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-oauth-serving-cert\") pod \"de2355a9-4fce-456b-b344-737e9f96d24c\" (UID: \"de2355a9-4fce-456b-b344-737e9f96d24c\") " Mar 19 09:40:24.651622 master-0 kubenswrapper[13205]: I0319 
09:40:24.651592 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-service-ca" (OuterVolumeSpecName: "service-ca") pod "de2355a9-4fce-456b-b344-737e9f96d24c" (UID: "de2355a9-4fce-456b-b344-737e9f96d24c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:24.651707 master-0 kubenswrapper[13205]: I0319 09:40:24.651676 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "de2355a9-4fce-456b-b344-737e9f96d24c" (UID: "de2355a9-4fce-456b-b344-737e9f96d24c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:24.651796 master-0 kubenswrapper[13205]: I0319 09:40:24.651773 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "de2355a9-4fce-456b-b344-737e9f96d24c" (UID: "de2355a9-4fce-456b-b344-737e9f96d24c"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:24.651972 master-0 kubenswrapper[13205]: I0319 09:40:24.651937 13205 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:24.651972 master-0 kubenswrapper[13205]: I0319 09:40:24.651961 13205 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:24.651972 master-0 kubenswrapper[13205]: I0319 09:40:24.651973 13205 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:24.652110 master-0 kubenswrapper[13205]: I0319 09:40:24.651983 13205 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de2355a9-4fce-456b-b344-737e9f96d24c-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:24.654898 master-0 kubenswrapper[13205]: I0319 09:40:24.654836 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "de2355a9-4fce-456b-b344-737e9f96d24c" (UID: "de2355a9-4fce-456b-b344-737e9f96d24c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:40:24.656727 master-0 kubenswrapper[13205]: I0319 09:40:24.656678 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "de2355a9-4fce-456b-b344-737e9f96d24c" (UID: "de2355a9-4fce-456b-b344-737e9f96d24c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:40:24.657105 master-0 kubenswrapper[13205]: I0319 09:40:24.657057 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2355a9-4fce-456b-b344-737e9f96d24c-kube-api-access-6xd8b" (OuterVolumeSpecName: "kube-api-access-6xd8b") pod "de2355a9-4fce-456b-b344-737e9f96d24c" (UID: "de2355a9-4fce-456b-b344-737e9f96d24c"). InnerVolumeSpecName "kube-api-access-6xd8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:40:24.753781 master-0 kubenswrapper[13205]: I0319 09:40:24.753673 13205 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:24.753781 master-0 kubenswrapper[13205]: I0319 09:40:24.753741 13205 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de2355a9-4fce-456b-b344-737e9f96d24c-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:24.753781 master-0 kubenswrapper[13205]: I0319 09:40:24.753754 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xd8b\" (UniqueName: \"kubernetes.io/projected/de2355a9-4fce-456b-b344-737e9f96d24c-kube-api-access-6xd8b\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:25.031793 master-0 kubenswrapper[13205]: I0319 09:40:25.031499 13205 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-fbc566cb4-69hmp" podUID="91c4bb91-752a-42c9-bc52-61bd8e935269" containerName="console" containerID="cri-o://77924f4a707facf578502f26f206a16ab0771cef3969262568d80c2e52c625d7" gracePeriod=15 Mar 19 09:40:25.267654 master-0 kubenswrapper[13205]: I0319 09:40:25.267597 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7444767f68-s8sv9_de2355a9-4fce-456b-b344-737e9f96d24c/console/0.log" Mar 19 09:40:25.268240 master-0 kubenswrapper[13205]: I0319 09:40:25.267747 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7444767f68-s8sv9" event={"ID":"de2355a9-4fce-456b-b344-737e9f96d24c","Type":"ContainerDied","Data":"e70367d9be06d99c372bf12d68a55591b18bad34df8c2cc5f657d86fa4f132f0"} Mar 19 09:40:25.268240 master-0 kubenswrapper[13205]: I0319 09:40:25.267799 13205 scope.go:117] "RemoveContainer" containerID="b5ba5edf74ff9415297f4981f39a429f20575244336852fb95623670d1f2534f" Mar 19 09:40:25.268240 master-0 kubenswrapper[13205]: I0319 09:40:25.267799 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7444767f68-s8sv9" Mar 19 09:40:25.270821 master-0 kubenswrapper[13205]: I0319 09:40:25.270783 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fbc566cb4-69hmp_91c4bb91-752a-42c9-bc52-61bd8e935269/console/0.log" Mar 19 09:40:25.270933 master-0 kubenswrapper[13205]: I0319 09:40:25.270871 13205 generic.go:334] "Generic (PLEG): container finished" podID="91c4bb91-752a-42c9-bc52-61bd8e935269" containerID="77924f4a707facf578502f26f206a16ab0771cef3969262568d80c2e52c625d7" exitCode=2 Mar 19 09:40:25.270933 master-0 kubenswrapper[13205]: I0319 09:40:25.270917 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fbc566cb4-69hmp" event={"ID":"91c4bb91-752a-42c9-bc52-61bd8e935269","Type":"ContainerDied","Data":"77924f4a707facf578502f26f206a16ab0771cef3969262568d80c2e52c625d7"} Mar 19 09:40:25.312592 master-0 kubenswrapper[13205]: I0319 09:40:25.312336 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7444767f68-s8sv9"] Mar 19 09:40:25.323780 master-0 kubenswrapper[13205]: I0319 09:40:25.323709 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7444767f68-s8sv9"] Mar 19 09:40:25.498945 master-0 kubenswrapper[13205]: I0319 09:40:25.498897 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fbc566cb4-69hmp_91c4bb91-752a-42c9-bc52-61bd8e935269/console/0.log" Mar 19 09:40:25.499140 master-0 kubenswrapper[13205]: I0319 09:40:25.498968 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:40:25.666906 master-0 kubenswrapper[13205]: I0319 09:40:25.666633 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-oauth-config\") pod \"91c4bb91-752a-42c9-bc52-61bd8e935269\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " Mar 19 09:40:25.666906 master-0 kubenswrapper[13205]: I0319 09:40:25.666741 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ncfp\" (UniqueName: \"kubernetes.io/projected/91c4bb91-752a-42c9-bc52-61bd8e935269-kube-api-access-4ncfp\") pod \"91c4bb91-752a-42c9-bc52-61bd8e935269\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " Mar 19 09:40:25.666906 master-0 kubenswrapper[13205]: I0319 09:40:25.666813 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-oauth-serving-cert\") pod \"91c4bb91-752a-42c9-bc52-61bd8e935269\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " Mar 19 09:40:25.666906 master-0 kubenswrapper[13205]: I0319 09:40:25.666844 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-trusted-ca-bundle\") pod \"91c4bb91-752a-42c9-bc52-61bd8e935269\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " Mar 19 09:40:25.667477 master-0 kubenswrapper[13205]: I0319 09:40:25.666994 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-console-config\") pod \"91c4bb91-752a-42c9-bc52-61bd8e935269\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " Mar 19 09:40:25.667608 master-0 
kubenswrapper[13205]: I0319 09:40:25.667431 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "91c4bb91-752a-42c9-bc52-61bd8e935269" (UID: "91c4bb91-752a-42c9-bc52-61bd8e935269"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:25.667608 master-0 kubenswrapper[13205]: I0319 09:40:25.667586 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-serving-cert\") pod \"91c4bb91-752a-42c9-bc52-61bd8e935269\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " Mar 19 09:40:25.668024 master-0 kubenswrapper[13205]: I0319 09:40:25.667928 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "91c4bb91-752a-42c9-bc52-61bd8e935269" (UID: "91c4bb91-752a-42c9-bc52-61bd8e935269"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:25.668217 master-0 kubenswrapper[13205]: I0319 09:40:25.668049 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-service-ca\") pod \"91c4bb91-752a-42c9-bc52-61bd8e935269\" (UID: \"91c4bb91-752a-42c9-bc52-61bd8e935269\") " Mar 19 09:40:25.668217 master-0 kubenswrapper[13205]: I0319 09:40:25.668048 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-console-config" (OuterVolumeSpecName: "console-config") pod "91c4bb91-752a-42c9-bc52-61bd8e935269" (UID: "91c4bb91-752a-42c9-bc52-61bd8e935269"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:25.668574 master-0 kubenswrapper[13205]: I0319 09:40:25.668502 13205 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:25.668574 master-0 kubenswrapper[13205]: I0319 09:40:25.668541 13205 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:25.668574 master-0 kubenswrapper[13205]: I0319 09:40:25.668555 13205 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:25.668871 master-0 kubenswrapper[13205]: I0319 09:40:25.668758 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-service-ca" (OuterVolumeSpecName: "service-ca") pod "91c4bb91-752a-42c9-bc52-61bd8e935269" (UID: "91c4bb91-752a-42c9-bc52-61bd8e935269"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:25.670668 master-0 kubenswrapper[13205]: I0319 09:40:25.670613 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c4bb91-752a-42c9-bc52-61bd8e935269-kube-api-access-4ncfp" (OuterVolumeSpecName: "kube-api-access-4ncfp") pod "91c4bb91-752a-42c9-bc52-61bd8e935269" (UID: "91c4bb91-752a-42c9-bc52-61bd8e935269"). InnerVolumeSpecName "kube-api-access-4ncfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:40:25.670668 master-0 kubenswrapper[13205]: I0319 09:40:25.670622 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "91c4bb91-752a-42c9-bc52-61bd8e935269" (UID: "91c4bb91-752a-42c9-bc52-61bd8e935269"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:40:25.671629 master-0 kubenswrapper[13205]: I0319 09:40:25.671584 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "91c4bb91-752a-42c9-bc52-61bd8e935269" (UID: "91c4bb91-752a-42c9-bc52-61bd8e935269"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:40:25.769770 master-0 kubenswrapper[13205]: I0319 09:40:25.769718 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ncfp\" (UniqueName: \"kubernetes.io/projected/91c4bb91-752a-42c9-bc52-61bd8e935269-kube-api-access-4ncfp\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:25.769770 master-0 kubenswrapper[13205]: I0319 09:40:25.769756 13205 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:25.769770 master-0 kubenswrapper[13205]: I0319 09:40:25.769770 13205 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c4bb91-752a-42c9-bc52-61bd8e935269-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:25.769770 master-0 kubenswrapper[13205]: I0319 09:40:25.769782 13205 reconciler_common.go:293] "Volume detached for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c4bb91-752a-42c9-bc52-61bd8e935269-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:40:26.285458 master-0 kubenswrapper[13205]: I0319 09:40:26.285369 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fbc566cb4-69hmp_91c4bb91-752a-42c9-bc52-61bd8e935269/console/0.log" Mar 19 09:40:26.286437 master-0 kubenswrapper[13205]: I0319 09:40:26.285464 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fbc566cb4-69hmp" event={"ID":"91c4bb91-752a-42c9-bc52-61bd8e935269","Type":"ContainerDied","Data":"1cb075b1d3cdcfa9c2f25ad9ef3d5b630182e8d973f6570ecbc3f52075c2302f"} Mar 19 09:40:26.286437 master-0 kubenswrapper[13205]: I0319 09:40:26.285597 13205 scope.go:117] "RemoveContainer" containerID="77924f4a707facf578502f26f206a16ab0771cef3969262568d80c2e52c625d7" Mar 19 09:40:26.286437 master-0 kubenswrapper[13205]: I0319 09:40:26.285621 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fbc566cb4-69hmp" Mar 19 09:40:26.349658 master-0 kubenswrapper[13205]: I0319 09:40:26.349587 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fbc566cb4-69hmp"] Mar 19 09:40:26.363273 master-0 kubenswrapper[13205]: I0319 09:40:26.363199 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-fbc566cb4-69hmp"] Mar 19 09:40:26.865604 master-0 kubenswrapper[13205]: I0319 09:40:26.865496 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c4bb91-752a-42c9-bc52-61bd8e935269" path="/var/lib/kubelet/pods/91c4bb91-752a-42c9-bc52-61bd8e935269/volumes" Mar 19 09:40:26.866733 master-0 kubenswrapper[13205]: I0319 09:40:26.866681 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2355a9-4fce-456b-b344-737e9f96d24c" path="/var/lib/kubelet/pods/de2355a9-4fce-456b-b344-737e9f96d24c/volumes" Mar 19 09:40:35.234030 master-0 kubenswrapper[13205]: E0319 09:40:35.233944 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:40:35.244814 master-0 kubenswrapper[13205]: I0319 09:40:35.244725 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-79f67cdc89-bx72w" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console" containerID="cri-o://a310c2e6c4b3f15606e08a140ed88a386ad094d2c2a0c14e05f5a9c148af6b08" gracePeriod=15 Mar 19 09:40:35.382353 master-0 kubenswrapper[13205]: I0319 09:40:35.382281 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-79f67cdc89-bx72w_5a8e5bd7-de13-4773-8a38-5edf4fda23fd/console/1.log"
Mar 19 09:40:35.384961 master-0 kubenswrapper[13205]: I0319 09:40:35.384058 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79f67cdc89-bx72w_5a8e5bd7-de13-4773-8a38-5edf4fda23fd/console/0.log"
Mar 19 09:40:35.384961 master-0 kubenswrapper[13205]: I0319 09:40:35.384201 13205 generic.go:334] "Generic (PLEG): container finished" podID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerID="a310c2e6c4b3f15606e08a140ed88a386ad094d2c2a0c14e05f5a9c148af6b08" exitCode=2
Mar 19 09:40:35.384961 master-0 kubenswrapper[13205]: I0319 09:40:35.384261 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79f67cdc89-bx72w" event={"ID":"5a8e5bd7-de13-4773-8a38-5edf4fda23fd","Type":"ContainerDied","Data":"a310c2e6c4b3f15606e08a140ed88a386ad094d2c2a0c14e05f5a9c148af6b08"}
Mar 19 09:40:35.384961 master-0 kubenswrapper[13205]: I0319 09:40:35.384334 13205 scope.go:117] "RemoveContainer" containerID="3126c808e06276b72f50b1bcec104cc8290fd8a1252c1d1a5a621abc3da492cd"
Mar 19 09:40:35.799014 master-0 kubenswrapper[13205]: I0319 09:40:35.798947 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79f67cdc89-bx72w_5a8e5bd7-de13-4773-8a38-5edf4fda23fd/console/1.log"
Mar 19 09:40:35.799208 master-0 kubenswrapper[13205]: I0319 09:40:35.799050 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79f67cdc89-bx72w"
Mar 19 09:40:35.963322 master-0 kubenswrapper[13205]: I0319 09:40:35.963115 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-config\") pod \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") "
Mar 19 09:40:35.963322 master-0 kubenswrapper[13205]: I0319 09:40:35.963212 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lv9\" (UniqueName: \"kubernetes.io/projected/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-kube-api-access-m5lv9\") pod \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") "
Mar 19 09:40:35.963322 master-0 kubenswrapper[13205]: I0319 09:40:35.963264 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-trusted-ca-bundle\") pod \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") "
Mar 19 09:40:35.963712 master-0 kubenswrapper[13205]: I0319 09:40:35.963357 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-oauth-serving-cert\") pod \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") "
Mar 19 09:40:35.965223 master-0 kubenswrapper[13205]: I0319 09:40:35.964260 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-oauth-config\") pod \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") "
Mar 19 09:40:35.965223 master-0 kubenswrapper[13205]: I0319 09:40:35.964332 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5a8e5bd7-de13-4773-8a38-5edf4fda23fd" (UID: "5a8e5bd7-de13-4773-8a38-5edf4fda23fd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:40:35.965223 master-0 kubenswrapper[13205]: I0319 09:40:35.964437 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-service-ca\") pod \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") "
Mar 19 09:40:35.965223 master-0 kubenswrapper[13205]: I0319 09:40:35.964561 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-serving-cert\") pod \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\" (UID: \"5a8e5bd7-de13-4773-8a38-5edf4fda23fd\") "
Mar 19 09:40:35.965223 master-0 kubenswrapper[13205]: I0319 09:40:35.964610 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-config" (OuterVolumeSpecName: "console-config") pod "5a8e5bd7-de13-4773-8a38-5edf4fda23fd" (UID: "5a8e5bd7-de13-4773-8a38-5edf4fda23fd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:40:35.965223 master-0 kubenswrapper[13205]: I0319 09:40:35.965176 13205 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:35.965223 master-0 kubenswrapper[13205]: I0319 09:40:35.965197 13205 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:35.965919 master-0 kubenswrapper[13205]: I0319 09:40:35.965401 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-service-ca" (OuterVolumeSpecName: "service-ca") pod "5a8e5bd7-de13-4773-8a38-5edf4fda23fd" (UID: "5a8e5bd7-de13-4773-8a38-5edf4fda23fd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:40:35.965919 master-0 kubenswrapper[13205]: I0319 09:40:35.965817 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5a8e5bd7-de13-4773-8a38-5edf4fda23fd" (UID: "5a8e5bd7-de13-4773-8a38-5edf4fda23fd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:40:35.968584 master-0 kubenswrapper[13205]: I0319 09:40:35.968513 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5a8e5bd7-de13-4773-8a38-5edf4fda23fd" (UID: "5a8e5bd7-de13-4773-8a38-5edf4fda23fd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:40:35.969130 master-0 kubenswrapper[13205]: I0319 09:40:35.969090 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-kube-api-access-m5lv9" (OuterVolumeSpecName: "kube-api-access-m5lv9") pod "5a8e5bd7-de13-4773-8a38-5edf4fda23fd" (UID: "5a8e5bd7-de13-4773-8a38-5edf4fda23fd"). InnerVolumeSpecName "kube-api-access-m5lv9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:40:35.969769 master-0 kubenswrapper[13205]: I0319 09:40:35.969699 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5a8e5bd7-de13-4773-8a38-5edf4fda23fd" (UID: "5a8e5bd7-de13-4773-8a38-5edf4fda23fd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:40:36.067038 master-0 kubenswrapper[13205]: I0319 09:40:36.066938 13205 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:36.067257 master-0 kubenswrapper[13205]: I0319 09:40:36.067079 13205 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:36.067257 master-0 kubenswrapper[13205]: I0319 09:40:36.067148 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5lv9\" (UniqueName: \"kubernetes.io/projected/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-kube-api-access-m5lv9\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:36.067257 master-0 kubenswrapper[13205]: I0319 09:40:36.067171 13205 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:36.067257 master-0 kubenswrapper[13205]: I0319 09:40:36.067190 13205 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a8e5bd7-de13-4773-8a38-5edf4fda23fd-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:36.401372 master-0 kubenswrapper[13205]: I0319 09:40:36.401249 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79f67cdc89-bx72w_5a8e5bd7-de13-4773-8a38-5edf4fda23fd/console/1.log"
Mar 19 09:40:36.402043 master-0 kubenswrapper[13205]: I0319 09:40:36.402013 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79f67cdc89-bx72w" event={"ID":"5a8e5bd7-de13-4773-8a38-5edf4fda23fd","Type":"ContainerDied","Data":"11181626b19d5fb500e0d964d41a94de1d31a7b757550c9a05f23f193dd72f08"}
Mar 19 09:40:36.402260 master-0 kubenswrapper[13205]: I0319 09:40:36.402241 13205 scope.go:117] "RemoveContainer" containerID="a310c2e6c4b3f15606e08a140ed88a386ad094d2c2a0c14e05f5a9c148af6b08"
Mar 19 09:40:36.402473 master-0 kubenswrapper[13205]: I0319 09:40:36.402459 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79f67cdc89-bx72w"
Mar 19 09:40:36.452621 master-0 kubenswrapper[13205]: I0319 09:40:36.452567 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79f67cdc89-bx72w"]
Mar 19 09:40:36.457632 master-0 kubenswrapper[13205]: I0319 09:40:36.457596 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79f67cdc89-bx72w"]
Mar 19 09:40:36.864109 master-0 kubenswrapper[13205]: I0319 09:40:36.864060 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" path="/var/lib/kubelet/pods/5a8e5bd7-de13-4773-8a38-5edf4fda23fd/volumes"
Mar 19 09:40:48.027451 master-0 kubenswrapper[13205]: I0319 09:40:48.027239 13205 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:40:48.028501 master-0 kubenswrapper[13205]: I0319 09:40:48.027682 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="78163c60e5607dc0ccb2f836459711da" containerName="cluster-policy-controller" containerID="cri-o://909226b2685511d6bab55ace265d3f240cb558432b507591e242a2a343509a3c" gracePeriod=30
Mar 19 09:40:48.028501 master-0 kubenswrapper[13205]: I0319 09:40:48.027794 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://1fa1b12321a3dee01daab35ab4e7b817db6ce8632ee7561cb941776b17b4a6df" gracePeriod=30
Mar 19 09:40:48.028501 master-0 kubenswrapper[13205]: I0319 09:40:48.027825 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager" containerID="cri-o://34566cfceb793a1b567a5c645aa383f0affc1644709bb43d497052e54db18d78" gracePeriod=30
Mar 19 09:40:48.028501 master-0 kubenswrapper[13205]: I0319 09:40:48.027847 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://e69e0a00be938327367cfad5fbbfef5b29328de2c5267b1b1fa3b89a40ee396f" gracePeriod=30
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.028665 13205 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.028993 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-recovery-controller"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029008 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-recovery-controller"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029022 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029031 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029048 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029057 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029070 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029079 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029091 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78163c60e5607dc0ccb2f836459711da" containerName="cluster-policy-controller"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029099 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="78163c60e5607dc0ccb2f836459711da" containerName="cluster-policy-controller"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029112 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-cert-syncer"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029120 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-cert-syncer"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029138 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029146 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029158 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2355a9-4fce-456b-b344-737e9f96d24c" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029167 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2355a9-4fce-456b-b344-737e9f96d24c" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: E0319 09:40:48.029180 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c4bb91-752a-42c9-bc52-61bd8e935269" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029188 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c4bb91-752a-42c9-bc52-61bd8e935269" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029368 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c4bb91-752a-42c9-bc52-61bd8e935269" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029383 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2355a9-4fce-456b-b344-737e9f96d24c" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029397 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="78163c60e5607dc0ccb2f836459711da" containerName="cluster-policy-controller"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029416 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-cert-syncer"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029430 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029438 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029450 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager-recovery-controller"
Mar 19 09:40:48.029609 master-0 kubenswrapper[13205]: I0319 09:40:48.029465 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8e5bd7-de13-4773-8a38-5edf4fda23fd" containerName="console"
Mar 19 09:40:48.031397 master-0 kubenswrapper[13205]: I0319 09:40:48.029790 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="78163c60e5607dc0ccb2f836459711da" containerName="kube-controller-manager"
Mar 19 09:40:48.091629 master-0 kubenswrapper[13205]: I0319 09:40:48.091472 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a2686abafe708291dc60d5667195e21a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a2686abafe708291dc60d5667195e21a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.091738 master-0 kubenswrapper[13205]: I0319 09:40:48.091658 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a2686abafe708291dc60d5667195e21a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a2686abafe708291dc60d5667195e21a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.192916 master-0 kubenswrapper[13205]: I0319 09:40:48.192869 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a2686abafe708291dc60d5667195e21a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a2686abafe708291dc60d5667195e21a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.193097 master-0 kubenswrapper[13205]: I0319 09:40:48.192944 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a2686abafe708291dc60d5667195e21a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a2686abafe708291dc60d5667195e21a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.193097 master-0 kubenswrapper[13205]: I0319 09:40:48.193031 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a2686abafe708291dc60d5667195e21a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a2686abafe708291dc60d5667195e21a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.193344 master-0 kubenswrapper[13205]: I0319 09:40:48.193312 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a2686abafe708291dc60d5667195e21a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a2686abafe708291dc60d5667195e21a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.277831 master-0 kubenswrapper[13205]: I0319 09:40:48.277671 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_78163c60e5607dc0ccb2f836459711da/kube-controller-manager-cert-syncer/0.log"
Mar 19 09:40:48.279159 master-0 kubenswrapper[13205]: I0319 09:40:48.279131 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_78163c60e5607dc0ccb2f836459711da/kube-controller-manager/0.log"
Mar 19 09:40:48.279306 master-0 kubenswrapper[13205]: I0319 09:40:48.279227 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.282740 master-0 kubenswrapper[13205]: I0319 09:40:48.282664 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="78163c60e5607dc0ccb2f836459711da" podUID="a2686abafe708291dc60d5667195e21a"
Mar 19 09:40:48.293665 master-0 kubenswrapper[13205]: I0319 09:40:48.293633 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-cert-dir\") pod \"78163c60e5607dc0ccb2f836459711da\" (UID: \"78163c60e5607dc0ccb2f836459711da\") "
Mar 19 09:40:48.293798 master-0 kubenswrapper[13205]: I0319 09:40:48.293673 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-resource-dir\") pod \"78163c60e5607dc0ccb2f836459711da\" (UID: \"78163c60e5607dc0ccb2f836459711da\") "
Mar 19 09:40:48.293873 master-0 kubenswrapper[13205]: I0319 09:40:48.293822 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "78163c60e5607dc0ccb2f836459711da" (UID: "78163c60e5607dc0ccb2f836459711da"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:40:48.294019 master-0 kubenswrapper[13205]: I0319 09:40:48.293984 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "78163c60e5607dc0ccb2f836459711da" (UID: "78163c60e5607dc0ccb2f836459711da"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:40:48.294201 master-0 kubenswrapper[13205]: I0319 09:40:48.294170 13205 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:48.294312 master-0 kubenswrapper[13205]: I0319 09:40:48.294203 13205 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/78163c60e5607dc0ccb2f836459711da-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:48.548694 master-0 kubenswrapper[13205]: I0319 09:40:48.548549 13205 generic.go:334] "Generic (PLEG): container finished" podID="13594ce3-7087-4af3-85eb-6c50b9e2bfd2" containerID="486d73682332891c32f16ccd66225151ea462ed169bd44a16ebd688d0a60ebee" exitCode=0
Mar 19 09:40:48.548694 master-0 kubenswrapper[13205]: I0319 09:40:48.548612 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" event={"ID":"13594ce3-7087-4af3-85eb-6c50b9e2bfd2","Type":"ContainerDied","Data":"486d73682332891c32f16ccd66225151ea462ed169bd44a16ebd688d0a60ebee"}
Mar 19 09:40:48.553035 master-0 kubenswrapper[13205]: I0319 09:40:48.552977 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_78163c60e5607dc0ccb2f836459711da/kube-controller-manager-cert-syncer/0.log"
Mar 19 09:40:48.554443 master-0 kubenswrapper[13205]: I0319 09:40:48.554410 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_78163c60e5607dc0ccb2f836459711da/kube-controller-manager/0.log"
Mar 19 09:40:48.554817 master-0 kubenswrapper[13205]: I0319 09:40:48.554791 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:48.554982 master-0 kubenswrapper[13205]: I0319 09:40:48.554731 13205 generic.go:334] "Generic (PLEG): container finished" podID="78163c60e5607dc0ccb2f836459711da" containerID="34566cfceb793a1b567a5c645aa383f0affc1644709bb43d497052e54db18d78" exitCode=0
Mar 19 09:40:48.555062 master-0 kubenswrapper[13205]: I0319 09:40:48.555015 13205 generic.go:334] "Generic (PLEG): container finished" podID="78163c60e5607dc0ccb2f836459711da" containerID="1fa1b12321a3dee01daab35ab4e7b817db6ce8632ee7561cb941776b17b4a6df" exitCode=0
Mar 19 09:40:48.555062 master-0 kubenswrapper[13205]: I0319 09:40:48.555036 13205 generic.go:334] "Generic (PLEG): container finished" podID="78163c60e5607dc0ccb2f836459711da" containerID="e69e0a00be938327367cfad5fbbfef5b29328de2c5267b1b1fa3b89a40ee396f" exitCode=2
Mar 19 09:40:48.555191 master-0 kubenswrapper[13205]: I0319 09:40:48.555070 13205 generic.go:334] "Generic (PLEG): container finished" podID="78163c60e5607dc0ccb2f836459711da" containerID="909226b2685511d6bab55ace265d3f240cb558432b507591e242a2a343509a3c" exitCode=0
Mar 19 09:40:48.555191 master-0 kubenswrapper[13205]: I0319 09:40:48.555122 13205 scope.go:117] "RemoveContainer" containerID="9891f7b5295dc9e748541b1d5291c66e77a0ec82f3b11cb284bbd29bce4baf72"
Mar 19 09:40:48.555191 master-0 kubenswrapper[13205]: I0319 09:40:48.555125 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9081822f793a182b6c3527a8e97a4cf0aa7422edceba5bbd9f78d610f192b334"
Mar 19 09:40:48.582015 master-0 kubenswrapper[13205]: I0319 09:40:48.578719 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="78163c60e5607dc0ccb2f836459711da" podUID="a2686abafe708291dc60d5667195e21a"
Mar 19 09:40:48.588824 master-0 kubenswrapper[13205]: I0319 09:40:48.588734 13205 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="78163c60e5607dc0ccb2f836459711da" podUID="a2686abafe708291dc60d5667195e21a"
Mar 19 09:40:48.862378 master-0 kubenswrapper[13205]: I0319 09:40:48.862245 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78163c60e5607dc0ccb2f836459711da" path="/var/lib/kubelet/pods/78163c60e5607dc0ccb2f836459711da/volumes"
Mar 19 09:40:49.573435 master-0 kubenswrapper[13205]: I0319 09:40:49.573358 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_78163c60e5607dc0ccb2f836459711da/kube-controller-manager-cert-syncer/0.log"
Mar 19 09:40:50.150370 master-0 kubenswrapper[13205]: I0319 09:40:50.150309 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0"
Mar 19 09:40:50.225595 master-0 kubenswrapper[13205]: I0319 09:40:50.225464 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kubelet-dir\") pod \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") "
Mar 19 09:40:50.225895 master-0 kubenswrapper[13205]: I0319 09:40:50.225609 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "13594ce3-7087-4af3-85eb-6c50b9e2bfd2" (UID: "13594ce3-7087-4af3-85eb-6c50b9e2bfd2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:40:50.225895 master-0 kubenswrapper[13205]: I0319 09:40:50.225660 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kube-api-access\") pod \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") "
Mar 19 09:40:50.226065 master-0 kubenswrapper[13205]: I0319 09:40:50.225935 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-var-lock\") pod \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\" (UID: \"13594ce3-7087-4af3-85eb-6c50b9e2bfd2\") "
Mar 19 09:40:50.226335 master-0 kubenswrapper[13205]: I0319 09:40:50.226269 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-var-lock" (OuterVolumeSpecName: "var-lock") pod "13594ce3-7087-4af3-85eb-6c50b9e2bfd2" (UID: "13594ce3-7087-4af3-85eb-6c50b9e2bfd2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:40:50.226654 master-0 kubenswrapper[13205]: I0319 09:40:50.226616 13205 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:50.226654 master-0 kubenswrapper[13205]: I0319 09:40:50.226638 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:50.231134 master-0 kubenswrapper[13205]: I0319 09:40:50.231076 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "13594ce3-7087-4af3-85eb-6c50b9e2bfd2" (UID: "13594ce3-7087-4af3-85eb-6c50b9e2bfd2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:40:50.328042 master-0 kubenswrapper[13205]: I0319 09:40:50.327927 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13594ce3-7087-4af3-85eb-6c50b9e2bfd2-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:40:50.590730 master-0 kubenswrapper[13205]: I0319 09:40:50.590636 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0" event={"ID":"13594ce3-7087-4af3-85eb-6c50b9e2bfd2","Type":"ContainerDied","Data":"9250eba53f1f68765a2fb388f1c635984bdc78050afad2f9b26969757af7d5ae"}
Mar 19 09:40:50.590730 master-0 kubenswrapper[13205]: I0319 09:40:50.590719 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9250eba53f1f68765a2fb388f1c635984bdc78050afad2f9b26969757af7d5ae"
Mar 19 09:40:50.592752 master-0 kubenswrapper[13205]: I0319 09:40:50.590757 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-6-retry-1-master-0"
Mar 19 09:40:51.623745 master-0 kubenswrapper[13205]: I0319 09:40:51.623675 13205 scope.go:117] "RemoveContainer" containerID="372cd682ac3c0ea2bb18f78daead6727ce073fa9a81ef16d8eb3a25f2f9a5913"
Mar 19 09:40:51.642175 master-0 kubenswrapper[13205]: I0319 09:40:51.642129 13205 scope.go:117] "RemoveContainer" containerID="91fc9820bbc378ac3c5d0235d69916e07ec51078314a806281d445caaaf1f9fe"
Mar 19 09:40:51.665800 master-0 kubenswrapper[13205]: I0319 09:40:51.665599 13205 scope.go:117] "RemoveContainer" containerID="4ffb489960ae764e7d490cc5a515222762442a3abaf010fa816c5f4dbae9dc07"
Mar 19 09:40:51.687715 master-0 kubenswrapper[13205]: I0319 09:40:51.687662 13205 scope.go:117] "RemoveContainer" containerID="0285ae72e28c1ba11c6c5f6c02e25350cf91b897190ff778fd152d801bc77680"
Mar 19 09:40:51.702881 master-0 kubenswrapper[13205]: I0319 09:40:51.702814 13205 scope.go:117] "RemoveContainer" containerID="d85c161426a1b0175ad90a172cca4e4d8843322ec3d411bcca9fccf3bb07ad91"
Mar 19 09:40:58.849034 master-0 kubenswrapper[13205]: I0319 09:40:58.848902 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:58.878914 master-0 kubenswrapper[13205]: I0319 09:40:58.878862 13205 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7ab8a1cf-a389-4cc9-aded-1c30fc67bebb"
Mar 19 09:40:58.878914 master-0 kubenswrapper[13205]: I0319 09:40:58.878906 13205 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7ab8a1cf-a389-4cc9-aded-1c30fc67bebb"
Mar 19 09:40:58.893839 master-0 kubenswrapper[13205]: I0319 09:40:58.893628 13205 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:58.900787 master-0 kubenswrapper[13205]: I0319 09:40:58.900743 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:40:58.906885 master-0 kubenswrapper[13205]: I0319 09:40:58.906825 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:40:58.923254 master-0 kubenswrapper[13205]: I0319 09:40:58.923166 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:40:58.929851 master-0 kubenswrapper[13205]: I0319 09:40:58.929767 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 09:40:58.938062 master-0 kubenswrapper[13205]: W0319 09:40:58.937918 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2686abafe708291dc60d5667195e21a.slice/crio-587db99616752d2d89d52c0e30db71403ac0c8e7e4ea237f62779ec449c37af4 WatchSource:0}: Error finding container 587db99616752d2d89d52c0e30db71403ac0c8e7e4ea237f62779ec449c37af4: Status 404 returned error can't find the container with id 587db99616752d2d89d52c0e30db71403ac0c8e7e4ea237f62779ec449c37af4
Mar 19 09:40:59.675165 master-0 kubenswrapper[13205]: I0319 09:40:59.675090 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a2686abafe708291dc60d5667195e21a","Type":"ContainerStarted","Data":"1ceeeb603541cfb9245562ef9b2e90ea9cab14baa2c43d484b0c2c09845fc0bf"}
Mar 19 09:40:59.675165 master-0 kubenswrapper[13205]: I0319 09:40:59.675142 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a2686abafe708291dc60d5667195e21a","Type":"ContainerStarted","Data":"e677235dff2ea1ee6d7bfd1b5bce184c9142db452892918b4e4f54add06c1be5"}
Mar 19 09:40:59.675165 master-0 kubenswrapper[13205]: I0319 09:40:59.675151 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a2686abafe708291dc60d5667195e21a","Type":"ContainerStarted","Data":"587db99616752d2d89d52c0e30db71403ac0c8e7e4ea237f62779ec449c37af4"}
Mar 19 09:41:00.686346 master-0 kubenswrapper[13205]: I0319 09:41:00.686289 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a2686abafe708291dc60d5667195e21a","Type":"ContainerStarted","Data":"26c784119cdcdd59f407d80975f45fa88e528d72d567678193d462575685056f"}
Mar 19 09:41:00.686346 master-0 kubenswrapper[13205]: I0319 09:41:00.686343 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a2686abafe708291dc60d5667195e21a","Type":"ContainerStarted","Data":"64e5f60be1a646de08f9ec1a2b4dca43d13271ad067551d0a16caf06d892ee21"}
Mar 19 09:41:00.706136 master-0 kubenswrapper[13205]: I0319 09:41:00.706038 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.706012924 podStartE2EDuration="2.706012924s" podCreationTimestamp="2026-03-19 09:40:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:41:00.703637216 +0000 UTC m=+1046.035944104" watchObservedRunningTime="2026-03-19 09:41:00.706012924 +0000 UTC m=+1046.038319832"
Mar 19 09:41:08.906999 master-0 kubenswrapper[13205]: I0319 09:41:08.906921 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:41:08.906999 master-0 kubenswrapper[13205]: I0319 09:41:08.906994 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:41:08.906999 master-0 kubenswrapper[13205]: I0319 09:41:08.907015 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:41:08.906999 master-0 kubenswrapper[13205]: I0319 09:41:08.907027 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:41:08.907846 master-0 kubenswrapper[13205]: I0319 09:41:08.907364 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 09:41:08.907846 master-0 kubenswrapper[13205]: I0319 09:41:08.907449 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a2686abafe708291dc60d5667195e21a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 09:41:08.913463 master-0 kubenswrapper[13205]: I0319 09:41:08.913420 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:41:09.775865 master-0 kubenswrapper[13205]: I0319 09:41:09.775783 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:41:18.907692 master-0 kubenswrapper[13205]: I0319 09:41:18.907631 13205 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
start-of-body= Mar 19 09:41:18.908885 master-0 kubenswrapper[13205]: I0319 09:41:18.908803 13205 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a2686abafe708291dc60d5667195e21a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 19 09:41:28.914171 master-0 kubenswrapper[13205]: I0319 09:41:28.914097 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:41:28.917799 master-0 kubenswrapper[13205]: I0319 09:41:28.917754 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:41:30.561373 master-0 kubenswrapper[13205]: I0319 09:41:30.561303 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-6-master-0"] Mar 19 09:41:30.562071 master-0 kubenswrapper[13205]: E0319 09:41:30.561859 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13594ce3-7087-4af3-85eb-6c50b9e2bfd2" containerName="installer" Mar 19 09:41:30.562071 master-0 kubenswrapper[13205]: I0319 09:41:30.561888 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="13594ce3-7087-4af3-85eb-6c50b9e2bfd2" containerName="installer" Mar 19 09:41:30.562206 master-0 kubenswrapper[13205]: I0319 09:41:30.562165 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="13594ce3-7087-4af3-85eb-6c50b9e2bfd2" containerName="installer" Mar 19 09:41:30.562999 master-0 kubenswrapper[13205]: I0319 09:41:30.562939 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:30.567612 master-0 kubenswrapper[13205]: I0319 09:41:30.566994 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-nm2j7" Mar 19 09:41:30.567765 master-0 kubenswrapper[13205]: I0319 09:41:30.567656 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:41:30.587210 master-0 kubenswrapper[13205]: I0319 09:41:30.587118 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-6-master-0"] Mar 19 09:41:30.608214 master-0 kubenswrapper[13205]: I0319 09:41:30.608101 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1228a411-677c-4ba0-96bb-9c6825839313-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:30.608421 master-0 kubenswrapper[13205]: I0319 09:41:30.608394 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1228a411-677c-4ba0-96bb-9c6825839313-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:30.709564 master-0 kubenswrapper[13205]: I0319 09:41:30.709470 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1228a411-677c-4ba0-96bb-9c6825839313-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " 
pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:30.709841 master-0 kubenswrapper[13205]: I0319 09:41:30.709782 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1228a411-677c-4ba0-96bb-9c6825839313-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:30.709907 master-0 kubenswrapper[13205]: I0319 09:41:30.709893 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1228a411-677c-4ba0-96bb-9c6825839313-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:30.729559 master-0 kubenswrapper[13205]: I0319 09:41:30.729469 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1228a411-677c-4ba0-96bb-9c6825839313-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:30.914876 master-0 kubenswrapper[13205]: I0319 09:41:30.914695 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:31.418719 master-0 kubenswrapper[13205]: I0319 09:41:31.416426 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-6-master-0"] Mar 19 09:41:31.419766 master-0 kubenswrapper[13205]: W0319 09:41:31.419132 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1228a411_677c_4ba0_96bb_9c6825839313.slice/crio-1111931e0699be9b0137bec94b8ea0a1860f68bbe51549ce205dc152fbf47505 WatchSource:0}: Error finding container 1111931e0699be9b0137bec94b8ea0a1860f68bbe51549ce205dc152fbf47505: Status 404 returned error can't find the container with id 1111931e0699be9b0137bec94b8ea0a1860f68bbe51549ce205dc152fbf47505 Mar 19 09:41:32.011955 master-0 kubenswrapper[13205]: I0319 09:41:32.011757 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" event={"ID":"1228a411-677c-4ba0-96bb-9c6825839313","Type":"ContainerStarted","Data":"dcce7d813551da2037de797bb6134ef05edd2d1a899f5671be19d90f58a1ed2a"} Mar 19 09:41:32.011955 master-0 kubenswrapper[13205]: I0319 09:41:32.011855 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" event={"ID":"1228a411-677c-4ba0-96bb-9c6825839313","Type":"ContainerStarted","Data":"1111931e0699be9b0137bec94b8ea0a1860f68bbe51549ce205dc152fbf47505"} Mar 19 09:41:32.035990 master-0 kubenswrapper[13205]: I0319 09:41:32.035873 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" podStartSLOduration=2.035848171 podStartE2EDuration="2.035848171s" podCreationTimestamp="2026-03-19 09:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
09:41:32.033898333 +0000 UTC m=+1077.366205251" watchObservedRunningTime="2026-03-19 09:41:32.035848171 +0000 UTC m=+1077.368155059" Mar 19 09:41:33.025635 master-0 kubenswrapper[13205]: I0319 09:41:33.025574 13205 generic.go:334] "Generic (PLEG): container finished" podID="1228a411-677c-4ba0-96bb-9c6825839313" containerID="dcce7d813551da2037de797bb6134ef05edd2d1a899f5671be19d90f58a1ed2a" exitCode=0 Mar 19 09:41:33.027410 master-0 kubenswrapper[13205]: I0319 09:41:33.025646 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" event={"ID":"1228a411-677c-4ba0-96bb-9c6825839313","Type":"ContainerDied","Data":"dcce7d813551da2037de797bb6134ef05edd2d1a899f5671be19d90f58a1ed2a"} Mar 19 09:41:34.467319 master-0 kubenswrapper[13205]: I0319 09:41:34.467219 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:34.621908 master-0 kubenswrapper[13205]: I0319 09:41:34.621804 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1228a411-677c-4ba0-96bb-9c6825839313-kubelet-dir\") pod \"1228a411-677c-4ba0-96bb-9c6825839313\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " Mar 19 09:41:34.622154 master-0 kubenswrapper[13205]: I0319 09:41:34.621946 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1228a411-677c-4ba0-96bb-9c6825839313-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1228a411-677c-4ba0-96bb-9c6825839313" (UID: "1228a411-677c-4ba0-96bb-9c6825839313"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:41:34.622154 master-0 kubenswrapper[13205]: I0319 09:41:34.622073 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1228a411-677c-4ba0-96bb-9c6825839313-kube-api-access\") pod \"1228a411-677c-4ba0-96bb-9c6825839313\" (UID: \"1228a411-677c-4ba0-96bb-9c6825839313\") " Mar 19 09:41:34.622419 master-0 kubenswrapper[13205]: I0319 09:41:34.622380 13205 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1228a411-677c-4ba0-96bb-9c6825839313-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:41:34.626461 master-0 kubenswrapper[13205]: I0319 09:41:34.626389 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1228a411-677c-4ba0-96bb-9c6825839313-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1228a411-677c-4ba0-96bb-9c6825839313" (UID: "1228a411-677c-4ba0-96bb-9c6825839313"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:41:34.725110 master-0 kubenswrapper[13205]: I0319 09:41:34.724988 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1228a411-677c-4ba0-96bb-9c6825839313-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:41:35.049795 master-0 kubenswrapper[13205]: I0319 09:41:35.049722 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" event={"ID":"1228a411-677c-4ba0-96bb-9c6825839313","Type":"ContainerDied","Data":"1111931e0699be9b0137bec94b8ea0a1860f68bbe51549ce205dc152fbf47505"} Mar 19 09:41:35.049795 master-0 kubenswrapper[13205]: I0319 09:41:35.049791 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1111931e0699be9b0137bec94b8ea0a1860f68bbe51549ce205dc152fbf47505" Mar 19 09:41:35.050401 master-0 kubenswrapper[13205]: I0319 09:41:35.049861 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-6-master-0" Mar 19 09:41:35.232481 master-0 kubenswrapper[13205]: E0319 09:41:35.232387 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:41:51.815425 master-0 kubenswrapper[13205]: I0319 09:41:51.815341 13205 scope.go:117] "RemoveContainer" containerID="909226b2685511d6bab55ace265d3f240cb558432b507591e242a2a343509a3c" Mar 19 09:41:51.837608 master-0 kubenswrapper[13205]: I0319 09:41:51.837556 13205 scope.go:117] "RemoveContainer" containerID="e69e0a00be938327367cfad5fbbfef5b29328de2c5267b1b1fa3b89a40ee396f" Mar 19 09:41:51.859671 master-0 kubenswrapper[13205]: I0319 09:41:51.859623 13205 scope.go:117] "RemoveContainer" containerID="1fa1b12321a3dee01daab35ab4e7b817db6ce8632ee7561cb941776b17b4a6df" Mar 19 09:41:51.942818 master-0 kubenswrapper[13205]: I0319 09:41:51.942728 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 09:41:51.952090 master-0 kubenswrapper[13205]: I0319 09:41:51.952024 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 09:41:52.858543 master-0 kubenswrapper[13205]: I0319 09:41:52.858475 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ca4232-9e9c-4b97-9c29-bead80a9a5fa" path="/var/lib/kubelet/pods/43ca4232-9e9c-4b97-9c29-bead80a9a5fa/volumes" Mar 19 09:42:35.228970 master-0 kubenswrapper[13205]: E0319 09:42:35.228913 13205 manager.go:1116] Failed to create existing container: 
/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:42:51.895603 master-0 kubenswrapper[13205]: I0319 09:42:51.895560 13205 scope.go:117] "RemoveContainer" containerID="46c63e43dc61899ca4cb1732e5d7d4e693a722f5fb486db67fb30cfa5bfc8af5" Mar 19 09:42:57.701274 master-0 kubenswrapper[13205]: I0319 09:42:57.701171 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q"] Mar 19 09:42:57.702342 master-0 kubenswrapper[13205]: E0319 09:42:57.701741 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1228a411-677c-4ba0-96bb-9c6825839313" containerName="pruner" Mar 19 09:42:57.702342 master-0 kubenswrapper[13205]: I0319 09:42:57.701776 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="1228a411-677c-4ba0-96bb-9c6825839313" containerName="pruner" Mar 19 09:42:57.702342 master-0 kubenswrapper[13205]: I0319 09:42:57.702158 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="1228a411-677c-4ba0-96bb-9c6825839313" containerName="pruner" Mar 19 09:42:57.704415 master-0 kubenswrapper[13205]: I0319 09:42:57.704354 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.713949 master-0 kubenswrapper[13205]: I0319 09:42:57.713728 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q"] Mar 19 09:42:57.741140 master-0 kubenswrapper[13205]: I0319 09:42:57.741031 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl86x\" (UniqueName: \"kubernetes.io/projected/4f682b6b-6bed-4d9a-9610-5c773e62c01c-kube-api-access-jl86x\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.741470 master-0 kubenswrapper[13205]: I0319 09:42:57.741408 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.741470 master-0 kubenswrapper[13205]: I0319 09:42:57.741469 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.842676 master-0 kubenswrapper[13205]: I0319 09:42:57.842588 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.842676 master-0 kubenswrapper[13205]: I0319 09:42:57.842671 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.843123 master-0 kubenswrapper[13205]: I0319 09:42:57.842733 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl86x\" (UniqueName: \"kubernetes.io/projected/4f682b6b-6bed-4d9a-9610-5c773e62c01c-kube-api-access-jl86x\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.843338 master-0 kubenswrapper[13205]: I0319 09:42:57.843299 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.844362 master-0 kubenswrapper[13205]: I0319 09:42:57.844307 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-bundle\") pod 
\"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:57.874647 master-0 kubenswrapper[13205]: I0319 09:42:57.872067 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl86x\" (UniqueName: \"kubernetes.io/projected/4f682b6b-6bed-4d9a-9610-5c773e62c01c-kube-api-access-jl86x\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:58.031642 master-0 kubenswrapper[13205]: I0319 09:42:58.031520 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:42:58.515387 master-0 kubenswrapper[13205]: I0319 09:42:58.514893 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q"] Mar 19 09:42:58.515594 master-0 kubenswrapper[13205]: W0319 09:42:58.515476 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f682b6b_6bed_4d9a_9610_5c773e62c01c.slice/crio-126f51ca569da4ee051d924f4f9e6f4f6cac906a5443f4e55203a9130daa9464 WatchSource:0}: Error finding container 126f51ca569da4ee051d924f4f9e6f4f6cac906a5443f4e55203a9130daa9464: Status 404 returned error can't find the container with id 126f51ca569da4ee051d924f4f9e6f4f6cac906a5443f4e55203a9130daa9464 Mar 19 09:42:58.869636 master-0 kubenswrapper[13205]: I0319 09:42:58.869561 13205 generic.go:334] "Generic (PLEG): container finished" podID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerID="e1ebe981e75f5aa8f58553e5194bf8bed5a3578a455794f15e9e07852f5c6f35" exitCode=0 Mar 
19 09:42:58.871706 master-0 kubenswrapper[13205]: I0319 09:42:58.869657 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" event={"ID":"4f682b6b-6bed-4d9a-9610-5c773e62c01c","Type":"ContainerDied","Data":"e1ebe981e75f5aa8f58553e5194bf8bed5a3578a455794f15e9e07852f5c6f35"} Mar 19 09:42:58.872029 master-0 kubenswrapper[13205]: I0319 09:42:58.871984 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" event={"ID":"4f682b6b-6bed-4d9a-9610-5c773e62c01c","Type":"ContainerStarted","Data":"126f51ca569da4ee051d924f4f9e6f4f6cac906a5443f4e55203a9130daa9464"} Mar 19 09:42:58.872234 master-0 kubenswrapper[13205]: I0319 09:42:58.870868 13205 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:43:00.424880 master-0 kubenswrapper[13205]: E0319 09:43:00.424723 13205 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f682b6b_6bed_4d9a_9610_5c773e62c01c.slice/crio-f14815a2fc0ebb2efe7f2c1769cedce5b10f460ba227d9a894d28dd115a0bf17.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f682b6b_6bed_4d9a_9610_5c773e62c01c.slice/crio-conmon-f14815a2fc0ebb2efe7f2c1769cedce5b10f460ba227d9a894d28dd115a0bf17.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:43:00.896110 master-0 kubenswrapper[13205]: I0319 09:43:00.895989 13205 generic.go:334] "Generic (PLEG): container finished" podID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerID="f14815a2fc0ebb2efe7f2c1769cedce5b10f460ba227d9a894d28dd115a0bf17" exitCode=0 Mar 19 09:43:00.896110 master-0 kubenswrapper[13205]: I0319 09:43:00.896079 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" event={"ID":"4f682b6b-6bed-4d9a-9610-5c773e62c01c","Type":"ContainerDied","Data":"f14815a2fc0ebb2efe7f2c1769cedce5b10f460ba227d9a894d28dd115a0bf17"} Mar 19 09:43:01.909026 master-0 kubenswrapper[13205]: I0319 09:43:01.908970 13205 generic.go:334] "Generic (PLEG): container finished" podID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerID="0c7745b80461efccd508c29469d2063c5d4ea6ed0674da49a5e1134bd199701d" exitCode=0 Mar 19 09:43:01.909026 master-0 kubenswrapper[13205]: I0319 09:43:01.909008 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" event={"ID":"4f682b6b-6bed-4d9a-9610-5c773e62c01c","Type":"ContainerDied","Data":"0c7745b80461efccd508c29469d2063c5d4ea6ed0674da49a5e1134bd199701d"} Mar 19 09:43:03.326342 master-0 kubenswrapper[13205]: I0319 09:43:03.326245 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:43:03.449104 master-0 kubenswrapper[13205]: I0319 09:43:03.448867 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-util\") pod \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " Mar 19 09:43:03.449581 master-0 kubenswrapper[13205]: I0319 09:43:03.449509 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl86x\" (UniqueName: \"kubernetes.io/projected/4f682b6b-6bed-4d9a-9610-5c773e62c01c-kube-api-access-jl86x\") pod \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " Mar 19 09:43:03.451225 master-0 kubenswrapper[13205]: I0319 09:43:03.450880 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-bundle\") pod \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\" (UID: \"4f682b6b-6bed-4d9a-9610-5c773e62c01c\") " Mar 19 09:43:03.451925 master-0 kubenswrapper[13205]: I0319 09:43:03.451670 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-bundle" (OuterVolumeSpecName: "bundle") pod "4f682b6b-6bed-4d9a-9610-5c773e62c01c" (UID: "4f682b6b-6bed-4d9a-9610-5c773e62c01c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:03.453123 master-0 kubenswrapper[13205]: I0319 09:43:03.453028 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f682b6b-6bed-4d9a-9610-5c773e62c01c-kube-api-access-jl86x" (OuterVolumeSpecName: "kube-api-access-jl86x") pod "4f682b6b-6bed-4d9a-9610-5c773e62c01c" (UID: "4f682b6b-6bed-4d9a-9610-5c773e62c01c"). InnerVolumeSpecName "kube-api-access-jl86x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:43:03.481945 master-0 kubenswrapper[13205]: I0319 09:43:03.481808 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-util" (OuterVolumeSpecName: "util") pod "4f682b6b-6bed-4d9a-9610-5c773e62c01c" (UID: "4f682b6b-6bed-4d9a-9610-5c773e62c01c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:03.553874 master-0 kubenswrapper[13205]: I0319 09:43:03.553810 13205 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:03.554235 master-0 kubenswrapper[13205]: I0319 09:43:03.554205 13205 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f682b6b-6bed-4d9a-9610-5c773e62c01c-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:03.554429 master-0 kubenswrapper[13205]: I0319 09:43:03.554398 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl86x\" (UniqueName: \"kubernetes.io/projected/4f682b6b-6bed-4d9a-9610-5c773e62c01c-kube-api-access-jl86x\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:03.936377 master-0 kubenswrapper[13205]: I0319 09:43:03.936184 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" event={"ID":"4f682b6b-6bed-4d9a-9610-5c773e62c01c","Type":"ContainerDied","Data":"126f51ca569da4ee051d924f4f9e6f4f6cac906a5443f4e55203a9130daa9464"} Mar 19 09:43:03.936377 master-0 kubenswrapper[13205]: I0319 09:43:03.936256 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="126f51ca569da4ee051d924f4f9e6f4f6cac906a5443f4e55203a9130daa9464" Mar 19 09:43:03.936750 master-0 kubenswrapper[13205]: I0319 09:43:03.936296 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4frj4q" Mar 19 09:43:10.760860 master-0 kubenswrapper[13205]: I0319 09:43:10.760787 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-6577d5757-2shjw"] Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: E0319 09:43:10.761047 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerName="pull" Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: I0319 09:43:10.761059 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerName="pull" Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: E0319 09:43:10.761075 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerName="extract" Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: I0319 09:43:10.761080 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerName="extract" Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: E0319 09:43:10.761098 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerName="util" Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: I0319 
09:43:10.761106 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerName="util" Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: I0319 09:43:10.761233 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f682b6b-6bed-4d9a-9610-5c773e62c01c" containerName="extract" Mar 19 09:43:10.761809 master-0 kubenswrapper[13205]: I0319 09:43:10.761660 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.764440 master-0 kubenswrapper[13205]: I0319 09:43:10.764204 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Mar 19 09:43:10.768012 master-0 kubenswrapper[13205]: I0319 09:43:10.767775 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Mar 19 09:43:10.768012 master-0 kubenswrapper[13205]: I0319 09:43:10.767798 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Mar 19 09:43:10.768012 master-0 kubenswrapper[13205]: I0319 09:43:10.767855 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Mar 19 09:43:10.768433 master-0 kubenswrapper[13205]: I0319 09:43:10.768301 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Mar 19 09:43:10.789001 master-0 kubenswrapper[13205]: I0319 09:43:10.788932 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-6577d5757-2shjw"] Mar 19 09:43:10.880272 master-0 kubenswrapper[13205]: I0319 09:43:10.880227 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-apiservice-cert\") pod 
\"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.880272 master-0 kubenswrapper[13205]: I0319 09:43:10.880279 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbmw\" (UniqueName: \"kubernetes.io/projected/65beb37f-3f54-4de2-84af-04c9d50784f9-kube-api-access-9qbmw\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.880555 master-0 kubenswrapper[13205]: I0319 09:43:10.880322 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/65beb37f-3f54-4de2-84af-04c9d50784f9-socket-dir\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.880555 master-0 kubenswrapper[13205]: I0319 09:43:10.880343 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-webhook-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.880625 master-0 kubenswrapper[13205]: I0319 09:43:10.880543 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-metrics-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.981984 master-0 kubenswrapper[13205]: I0319 09:43:10.981765 13205 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-apiservice-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.983246 master-0 kubenswrapper[13205]: I0319 09:43:10.982367 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qbmw\" (UniqueName: \"kubernetes.io/projected/65beb37f-3f54-4de2-84af-04c9d50784f9-kube-api-access-9qbmw\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.983246 master-0 kubenswrapper[13205]: I0319 09:43:10.982512 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/65beb37f-3f54-4de2-84af-04c9d50784f9-socket-dir\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.983246 master-0 kubenswrapper[13205]: I0319 09:43:10.982556 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-webhook-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.983246 master-0 kubenswrapper[13205]: I0319 09:43:10.982697 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-metrics-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.983839 master-0 kubenswrapper[13205]: I0319 09:43:10.983805 13205 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/65beb37f-3f54-4de2-84af-04c9d50784f9-socket-dir\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.984969 master-0 kubenswrapper[13205]: I0319 09:43:10.984905 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-apiservice-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.986820 master-0 kubenswrapper[13205]: I0319 09:43:10.986789 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-webhook-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:10.987680 master-0 kubenswrapper[13205]: I0319 09:43:10.987195 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/65beb37f-3f54-4de2-84af-04c9d50784f9-metrics-cert\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:11.000063 master-0 kubenswrapper[13205]: I0319 09:43:11.000025 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbmw\" (UniqueName: \"kubernetes.io/projected/65beb37f-3f54-4de2-84af-04c9d50784f9-kube-api-access-9qbmw\") pod \"lvms-operator-6577d5757-2shjw\" (UID: \"65beb37f-3f54-4de2-84af-04c9d50784f9\") " pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:11.078584 master-0 kubenswrapper[13205]: I0319 
09:43:11.078504 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:11.541016 master-0 kubenswrapper[13205]: I0319 09:43:11.540966 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-6577d5757-2shjw"] Mar 19 09:43:11.546487 master-0 kubenswrapper[13205]: W0319 09:43:11.546412 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65beb37f_3f54_4de2_84af_04c9d50784f9.slice/crio-be8351600449ad211f7b0e6e276a704d1b0cf627289b802096c1a16caf5b1efd WatchSource:0}: Error finding container be8351600449ad211f7b0e6e276a704d1b0cf627289b802096c1a16caf5b1efd: Status 404 returned error can't find the container with id be8351600449ad211f7b0e6e276a704d1b0cf627289b802096c1a16caf5b1efd Mar 19 09:43:12.004751 master-0 kubenswrapper[13205]: I0319 09:43:12.004028 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-6577d5757-2shjw" event={"ID":"65beb37f-3f54-4de2-84af-04c9d50784f9","Type":"ContainerStarted","Data":"be8351600449ad211f7b0e6e276a704d1b0cf627289b802096c1a16caf5b1efd"} Mar 19 09:43:17.053692 master-0 kubenswrapper[13205]: I0319 09:43:17.053590 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-6577d5757-2shjw" event={"ID":"65beb37f-3f54-4de2-84af-04c9d50784f9","Type":"ContainerStarted","Data":"fc2e5660a5c63eb36b15f531673ad1c4f295b103a33017dfb4aa78d24ee9b232"} Mar 19 09:43:17.055447 master-0 kubenswrapper[13205]: I0319 09:43:17.053965 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:17.057790 master-0 kubenswrapper[13205]: I0319 09:43:17.057191 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-6577d5757-2shjw" Mar 19 09:43:17.089280 master-0 
kubenswrapper[13205]: I0319 09:43:17.089049 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-6577d5757-2shjw" podStartSLOduration=2.231320695 podStartE2EDuration="7.08900971s" podCreationTimestamp="2026-03-19 09:43:10 +0000 UTC" firstStartedPulling="2026-03-19 09:43:11.549773138 +0000 UTC m=+1176.882080026" lastFinishedPulling="2026-03-19 09:43:16.407462153 +0000 UTC m=+1181.739769041" observedRunningTime="2026-03-19 09:43:17.074510588 +0000 UTC m=+1182.406817486" watchObservedRunningTime="2026-03-19 09:43:17.08900971 +0000 UTC m=+1182.421316648" Mar 19 09:43:21.066222 master-0 kubenswrapper[13205]: I0319 09:43:21.066141 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7"] Mar 19 09:43:21.068453 master-0 kubenswrapper[13205]: I0319 09:43:21.068410 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.120735 master-0 kubenswrapper[13205]: I0319 09:43:21.120670 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7"] Mar 19 09:43:21.173288 master-0 kubenswrapper[13205]: I0319 09:43:21.173220 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.173288 master-0 kubenswrapper[13205]: I0319 09:43:21.173294 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.174025 master-0 kubenswrapper[13205]: I0319 09:43:21.173986 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5wjd\" (UniqueName: \"kubernetes.io/projected/dea5f323-0902-4f06-a00d-2e60ddd72d84-kube-api-access-z5wjd\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.274956 master-0 kubenswrapper[13205]: I0319 09:43:21.274889 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5wjd\" (UniqueName: \"kubernetes.io/projected/dea5f323-0902-4f06-a00d-2e60ddd72d84-kube-api-access-z5wjd\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.274956 master-0 kubenswrapper[13205]: I0319 09:43:21.274961 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.275367 master-0 kubenswrapper[13205]: I0319 09:43:21.275285 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.275704 master-0 kubenswrapper[13205]: I0319 09:43:21.275664 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.277199 master-0 kubenswrapper[13205]: I0319 09:43:21.277161 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.571080 master-0 kubenswrapper[13205]: I0319 09:43:21.571031 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5wjd\" (UniqueName: \"kubernetes.io/projected/dea5f323-0902-4f06-a00d-2e60ddd72d84-kube-api-access-z5wjd\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.685088 master-0 kubenswrapper[13205]: I0319 09:43:21.685017 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:21.694740 master-0 kubenswrapper[13205]: I0319 09:43:21.694682 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw"] Mar 19 09:43:21.697428 master-0 kubenswrapper[13205]: I0319 09:43:21.697393 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.749503 master-0 kubenswrapper[13205]: I0319 09:43:21.749454 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw"] Mar 19 09:43:21.886923 master-0 kubenswrapper[13205]: I0319 09:43:21.886473 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.886923 master-0 kubenswrapper[13205]: I0319 09:43:21.886638 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdpcl\" (UniqueName: \"kubernetes.io/projected/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-kube-api-access-xdpcl\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.886923 master-0 kubenswrapper[13205]: I0319 09:43:21.886714 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.988582 master-0 kubenswrapper[13205]: I0319 09:43:21.988472 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.988809 master-0 kubenswrapper[13205]: I0319 09:43:21.988599 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdpcl\" (UniqueName: \"kubernetes.io/projected/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-kube-api-access-xdpcl\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.989252 master-0 kubenswrapper[13205]: I0319 09:43:21.989202 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.989593 master-0 kubenswrapper[13205]: I0319 09:43:21.989535 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-bundle\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:21.990287 master-0 kubenswrapper[13205]: I0319 09:43:21.990244 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:22.007215 master-0 kubenswrapper[13205]: I0319 09:43:22.007163 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdpcl\" (UniqueName: \"kubernetes.io/projected/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-kube-api-access-xdpcl\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:22.069604 master-0 kubenswrapper[13205]: I0319 09:43:22.069493 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:22.256013 master-0 kubenswrapper[13205]: I0319 09:43:22.255913 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7"] Mar 19 09:43:22.259870 master-0 kubenswrapper[13205]: W0319 09:43:22.259747 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea5f323_0902_4f06_a00d_2e60ddd72d84.slice/crio-351d6e830d5dd70ce8f2fce8a8ece7d3db13e90694544b2fa0de1ae3d2b3725a WatchSource:0}: Error finding container 351d6e830d5dd70ce8f2fce8a8ece7d3db13e90694544b2fa0de1ae3d2b3725a: Status 404 returned error can't find the container with id 351d6e830d5dd70ce8f2fce8a8ece7d3db13e90694544b2fa0de1ae3d2b3725a Mar 19 09:43:22.681483 master-0 kubenswrapper[13205]: I0319 09:43:22.681378 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw"] Mar 19 09:43:23.107932 master-0 kubenswrapper[13205]: I0319 09:43:23.107853 13205 generic.go:334] "Generic (PLEG): container finished" podID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerID="4cf3c27bdda782dce7aa36688f1e6e62440cd0a0b0e1a1bfcf264a780300d2f3" exitCode=0 Mar 19 09:43:23.108796 master-0 kubenswrapper[13205]: I0319 09:43:23.107946 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" event={"ID":"7a042d5e-e45b-4b6a-a9ec-6c476129b86c","Type":"ContainerDied","Data":"4cf3c27bdda782dce7aa36688f1e6e62440cd0a0b0e1a1bfcf264a780300d2f3"} Mar 19 09:43:23.108796 master-0 kubenswrapper[13205]: I0319 09:43:23.107992 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" 
event={"ID":"7a042d5e-e45b-4b6a-a9ec-6c476129b86c","Type":"ContainerStarted","Data":"f6cdf4c65b712e3e5f71e738c92cf50c6ffc5e87bbc9b1e6fe3b5770aca29d15"} Mar 19 09:43:23.110226 master-0 kubenswrapper[13205]: I0319 09:43:23.110174 13205 generic.go:334] "Generic (PLEG): container finished" podID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerID="8aa61ff189e7453ca949c5e185084de71d41af8d4dd0b0434a207c182707e200" exitCode=0 Mar 19 09:43:23.110226 master-0 kubenswrapper[13205]: I0319 09:43:23.110216 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" event={"ID":"dea5f323-0902-4f06-a00d-2e60ddd72d84","Type":"ContainerDied","Data":"8aa61ff189e7453ca949c5e185084de71d41af8d4dd0b0434a207c182707e200"} Mar 19 09:43:23.110432 master-0 kubenswrapper[13205]: I0319 09:43:23.110237 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" event={"ID":"dea5f323-0902-4f06-a00d-2e60ddd72d84","Type":"ContainerStarted","Data":"351d6e830d5dd70ce8f2fce8a8ece7d3db13e90694544b2fa0de1ae3d2b3725a"} Mar 19 09:43:23.992731 master-0 kubenswrapper[13205]: I0319 09:43:23.992602 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz"] Mar 19 09:43:23.999070 master-0 kubenswrapper[13205]: I0319 09:43:23.999020 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:23.999806 master-0 kubenswrapper[13205]: I0319 09:43:23.999764 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz"] Mar 19 09:43:24.022350 master-0 kubenswrapper[13205]: I0319 09:43:24.022304 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.023107 master-0 kubenswrapper[13205]: I0319 09:43:24.022910 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-526qk\" (UniqueName: \"kubernetes.io/projected/0603a771-781d-4522-9756-cbf1631640e1-kube-api-access-526qk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.023107 master-0 kubenswrapper[13205]: I0319 09:43:24.023032 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.125138 master-0 kubenswrapper[13205]: I0319 09:43:24.123937 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.125138 master-0 kubenswrapper[13205]: I0319 09:43:24.124056 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-526qk\" (UniqueName: \"kubernetes.io/projected/0603a771-781d-4522-9756-cbf1631640e1-kube-api-access-526qk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.125138 master-0 kubenswrapper[13205]: I0319 09:43:24.124095 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.125138 master-0 kubenswrapper[13205]: I0319 09:43:24.124708 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.125138 master-0 kubenswrapper[13205]: I0319 09:43:24.125039 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-util\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.144002 master-0 kubenswrapper[13205]: I0319 09:43:24.143624 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-526qk\" (UniqueName: \"kubernetes.io/projected/0603a771-781d-4522-9756-cbf1631640e1-kube-api-access-526qk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:24.319131 master-0 kubenswrapper[13205]: I0319 09:43:24.319069 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:25.672129 master-0 kubenswrapper[13205]: I0319 09:43:25.672023 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz"] Mar 19 09:43:26.158795 master-0 kubenswrapper[13205]: W0319 09:43:26.158727 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0603a771_781d_4522_9756_cbf1631640e1.slice/crio-04aba79421f28e764b56c2ff42b858c333cce704a8d424a86852a9b7bc717e08 WatchSource:0}: Error finding container 04aba79421f28e764b56c2ff42b858c333cce704a8d424a86852a9b7bc717e08: Status 404 returned error can't find the container with id 04aba79421f28e764b56c2ff42b858c333cce704a8d424a86852a9b7bc717e08 Mar 19 09:43:27.145881 master-0 kubenswrapper[13205]: I0319 09:43:27.145815 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" 
event={"ID":"dea5f323-0902-4f06-a00d-2e60ddd72d84","Type":"ContainerStarted","Data":"bdc663f9fb9b4eef235d1561dce9a1fd7fc213407fbc9a3ca7639048a2be1f58"} Mar 19 09:43:27.147956 master-0 kubenswrapper[13205]: I0319 09:43:27.147920 13205 generic.go:334] "Generic (PLEG): container finished" podID="0603a771-781d-4522-9756-cbf1631640e1" containerID="6ebd3b3a1b02d29c6032c4f8bada5ca1fc77bc3712886482acf1a93181d3aba2" exitCode=0 Mar 19 09:43:27.148124 master-0 kubenswrapper[13205]: I0319 09:43:27.147963 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" event={"ID":"0603a771-781d-4522-9756-cbf1631640e1","Type":"ContainerDied","Data":"6ebd3b3a1b02d29c6032c4f8bada5ca1fc77bc3712886482acf1a93181d3aba2"} Mar 19 09:43:27.148264 master-0 kubenswrapper[13205]: I0319 09:43:27.148232 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" event={"ID":"0603a771-781d-4522-9756-cbf1631640e1","Type":"ContainerStarted","Data":"04aba79421f28e764b56c2ff42b858c333cce704a8d424a86852a9b7bc717e08"} Mar 19 09:43:27.150811 master-0 kubenswrapper[13205]: I0319 09:43:27.150767 13205 generic.go:334] "Generic (PLEG): container finished" podID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerID="b446921051435211e5bf2ba2aadd7ee7cc4126ce6e5386c0285d2e3336b3319a" exitCode=0 Mar 19 09:43:27.150888 master-0 kubenswrapper[13205]: I0319 09:43:27.150817 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" event={"ID":"7a042d5e-e45b-4b6a-a9ec-6c476129b86c","Type":"ContainerDied","Data":"b446921051435211e5bf2ba2aadd7ee7cc4126ce6e5386c0285d2e3336b3319a"} Mar 19 09:43:28.160473 master-0 kubenswrapper[13205]: I0319 09:43:28.160384 13205 generic.go:334] "Generic (PLEG): container finished" podID="dea5f323-0902-4f06-a00d-2e60ddd72d84" 
containerID="bdc663f9fb9b4eef235d1561dce9a1fd7fc213407fbc9a3ca7639048a2be1f58" exitCode=0 Mar 19 09:43:28.160473 master-0 kubenswrapper[13205]: I0319 09:43:28.160433 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" event={"ID":"dea5f323-0902-4f06-a00d-2e60ddd72d84","Type":"ContainerDied","Data":"bdc663f9fb9b4eef235d1561dce9a1fd7fc213407fbc9a3ca7639048a2be1f58"} Mar 19 09:43:28.167872 master-0 kubenswrapper[13205]: I0319 09:43:28.164671 13205 generic.go:334] "Generic (PLEG): container finished" podID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerID="b355e666e1c683ac01242b8a3d2971fe1a93dce31e8be112737c5bc8984783e5" exitCode=0 Mar 19 09:43:28.167872 master-0 kubenswrapper[13205]: I0319 09:43:28.164716 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" event={"ID":"7a042d5e-e45b-4b6a-a9ec-6c476129b86c","Type":"ContainerDied","Data":"b355e666e1c683ac01242b8a3d2971fe1a93dce31e8be112737c5bc8984783e5"} Mar 19 09:43:29.179499 master-0 kubenswrapper[13205]: I0319 09:43:29.179386 13205 generic.go:334] "Generic (PLEG): container finished" podID="0603a771-781d-4522-9756-cbf1631640e1" containerID="fd9bd5d010de7c3daf0e2c3da13428206e6a325b6b115e0d00fd11a108dce97f" exitCode=0 Mar 19 09:43:29.180521 master-0 kubenswrapper[13205]: I0319 09:43:29.179517 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" event={"ID":"0603a771-781d-4522-9756-cbf1631640e1","Type":"ContainerDied","Data":"fd9bd5d010de7c3daf0e2c3da13428206e6a325b6b115e0d00fd11a108dce97f"} Mar 19 09:43:29.196691 master-0 kubenswrapper[13205]: I0319 09:43:29.196622 13205 generic.go:334] "Generic (PLEG): container finished" podID="dea5f323-0902-4f06-a00d-2e60ddd72d84" 
containerID="98b80e166d4ead480b36adcabeb9d9d5fe0b8b9630da066d75264936b3f34897" exitCode=0 Mar 19 09:43:29.196938 master-0 kubenswrapper[13205]: I0319 09:43:29.196777 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" event={"ID":"dea5f323-0902-4f06-a00d-2e60ddd72d84","Type":"ContainerDied","Data":"98b80e166d4ead480b36adcabeb9d9d5fe0b8b9630da066d75264936b3f34897"} Mar 19 09:43:29.705340 master-0 kubenswrapper[13205]: I0319 09:43:29.705286 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:29.734365 master-0 kubenswrapper[13205]: I0319 09:43:29.734296 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdpcl\" (UniqueName: \"kubernetes.io/projected/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-kube-api-access-xdpcl\") pod \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " Mar 19 09:43:29.734599 master-0 kubenswrapper[13205]: I0319 09:43:29.734479 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-util\") pod \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " Mar 19 09:43:29.734722 master-0 kubenswrapper[13205]: I0319 09:43:29.734619 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-bundle\") pod \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\" (UID: \"7a042d5e-e45b-4b6a-a9ec-6c476129b86c\") " Mar 19 09:43:29.735702 master-0 kubenswrapper[13205]: I0319 09:43:29.735642 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-bundle" (OuterVolumeSpecName: "bundle") pod "7a042d5e-e45b-4b6a-a9ec-6c476129b86c" (UID: "7a042d5e-e45b-4b6a-a9ec-6c476129b86c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:29.742588 master-0 kubenswrapper[13205]: I0319 09:43:29.739231 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-kube-api-access-xdpcl" (OuterVolumeSpecName: "kube-api-access-xdpcl") pod "7a042d5e-e45b-4b6a-a9ec-6c476129b86c" (UID: "7a042d5e-e45b-4b6a-a9ec-6c476129b86c"). InnerVolumeSpecName "kube-api-access-xdpcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:43:29.747980 master-0 kubenswrapper[13205]: I0319 09:43:29.747925 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-util" (OuterVolumeSpecName: "util") pod "7a042d5e-e45b-4b6a-a9ec-6c476129b86c" (UID: "7a042d5e-e45b-4b6a-a9ec-6c476129b86c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:29.837223 master-0 kubenswrapper[13205]: I0319 09:43:29.837058 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdpcl\" (UniqueName: \"kubernetes.io/projected/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-kube-api-access-xdpcl\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:29.837223 master-0 kubenswrapper[13205]: I0319 09:43:29.837120 13205 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:29.837223 master-0 kubenswrapper[13205]: I0319 09:43:29.837136 13205 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a042d5e-e45b-4b6a-a9ec-6c476129b86c-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:30.209449 master-0 kubenswrapper[13205]: I0319 09:43:30.209278 13205 generic.go:334] "Generic (PLEG): container finished" podID="0603a771-781d-4522-9756-cbf1631640e1" containerID="d0c395966849e92f8c103372f3b27143d0fb89654a0a21e3407c6f542f383dcc" exitCode=0 Mar 19 09:43:30.209449 master-0 kubenswrapper[13205]: I0319 09:43:30.209412 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" event={"ID":"0603a771-781d-4522-9756-cbf1631640e1","Type":"ContainerDied","Data":"d0c395966849e92f8c103372f3b27143d0fb89654a0a21e3407c6f542f383dcc"} Mar 19 09:43:30.213924 master-0 kubenswrapper[13205]: I0319 09:43:30.213873 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" event={"ID":"7a042d5e-e45b-4b6a-a9ec-6c476129b86c","Type":"ContainerDied","Data":"f6cdf4c65b712e3e5f71e738c92cf50c6ffc5e87bbc9b1e6fe3b5770aca29d15"} Mar 19 09:43:30.214109 master-0 kubenswrapper[13205]: I0319 09:43:30.213977 13205 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6cdf4c65b712e3e5f71e738c92cf50c6ffc5e87bbc9b1e6fe3b5770aca29d15" Mar 19 09:43:30.214109 master-0 kubenswrapper[13205]: I0319 09:43:30.214013 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1hdwsw" Mar 19 09:43:30.703249 master-0 kubenswrapper[13205]: I0319 09:43:30.703207 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:30.750182 master-0 kubenswrapper[13205]: I0319 09:43:30.750099 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-bundle\") pod \"dea5f323-0902-4f06-a00d-2e60ddd72d84\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " Mar 19 09:43:30.750420 master-0 kubenswrapper[13205]: I0319 09:43:30.750269 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-util\") pod \"dea5f323-0902-4f06-a00d-2e60ddd72d84\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " Mar 19 09:43:30.750420 master-0 kubenswrapper[13205]: I0319 09:43:30.750324 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5wjd\" (UniqueName: \"kubernetes.io/projected/dea5f323-0902-4f06-a00d-2e60ddd72d84-kube-api-access-z5wjd\") pod \"dea5f323-0902-4f06-a00d-2e60ddd72d84\" (UID: \"dea5f323-0902-4f06-a00d-2e60ddd72d84\") " Mar 19 09:43:30.752966 master-0 kubenswrapper[13205]: I0319 09:43:30.752899 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-bundle" (OuterVolumeSpecName: "bundle") pod 
"dea5f323-0902-4f06-a00d-2e60ddd72d84" (UID: "dea5f323-0902-4f06-a00d-2e60ddd72d84"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:30.774575 master-0 kubenswrapper[13205]: I0319 09:43:30.757870 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea5f323-0902-4f06-a00d-2e60ddd72d84-kube-api-access-z5wjd" (OuterVolumeSpecName: "kube-api-access-z5wjd") pod "dea5f323-0902-4f06-a00d-2e60ddd72d84" (UID: "dea5f323-0902-4f06-a00d-2e60ddd72d84"). InnerVolumeSpecName "kube-api-access-z5wjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:43:30.774575 master-0 kubenswrapper[13205]: I0319 09:43:30.768177 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-util" (OuterVolumeSpecName: "util") pod "dea5f323-0902-4f06-a00d-2e60ddd72d84" (UID: "dea5f323-0902-4f06-a00d-2e60ddd72d84"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:30.852258 master-0 kubenswrapper[13205]: I0319 09:43:30.852078 13205 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:30.852258 master-0 kubenswrapper[13205]: I0319 09:43:30.852124 13205 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea5f323-0902-4f06-a00d-2e60ddd72d84-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:30.852258 master-0 kubenswrapper[13205]: I0319 09:43:30.852138 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5wjd\" (UniqueName: \"kubernetes.io/projected/dea5f323-0902-4f06-a00d-2e60ddd72d84-kube-api-access-z5wjd\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:31.226800 master-0 kubenswrapper[13205]: I0319 09:43:31.226592 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" event={"ID":"dea5f323-0902-4f06-a00d-2e60ddd72d84","Type":"ContainerDied","Data":"351d6e830d5dd70ce8f2fce8a8ece7d3db13e90694544b2fa0de1ae3d2b3725a"} Mar 19 09:43:31.226800 master-0 kubenswrapper[13205]: I0319 09:43:31.226688 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="351d6e830d5dd70ce8f2fce8a8ece7d3db13e90694544b2fa0de1ae3d2b3725a" Mar 19 09:43:31.227729 master-0 kubenswrapper[13205]: I0319 09:43:31.227596 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xklw7" Mar 19 09:43:31.662050 master-0 kubenswrapper[13205]: I0319 09:43:31.661976 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:31.768884 master-0 kubenswrapper[13205]: I0319 09:43:31.767968 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-util\") pod \"0603a771-781d-4522-9756-cbf1631640e1\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " Mar 19 09:43:31.769149 master-0 kubenswrapper[13205]: I0319 09:43:31.769018 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-526qk\" (UniqueName: \"kubernetes.io/projected/0603a771-781d-4522-9756-cbf1631640e1-kube-api-access-526qk\") pod \"0603a771-781d-4522-9756-cbf1631640e1\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " Mar 19 09:43:31.769326 master-0 kubenswrapper[13205]: I0319 09:43:31.769283 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-bundle\") pod \"0603a771-781d-4522-9756-cbf1631640e1\" (UID: \"0603a771-781d-4522-9756-cbf1631640e1\") " Mar 19 09:43:31.770157 master-0 kubenswrapper[13205]: I0319 09:43:31.770111 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-bundle" (OuterVolumeSpecName: "bundle") pod "0603a771-781d-4522-9756-cbf1631640e1" (UID: "0603a771-781d-4522-9756-cbf1631640e1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:31.772466 master-0 kubenswrapper[13205]: I0319 09:43:31.772374 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0603a771-781d-4522-9756-cbf1631640e1-kube-api-access-526qk" (OuterVolumeSpecName: "kube-api-access-526qk") pod "0603a771-781d-4522-9756-cbf1631640e1" (UID: "0603a771-781d-4522-9756-cbf1631640e1"). InnerVolumeSpecName "kube-api-access-526qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:43:31.795716 master-0 kubenswrapper[13205]: I0319 09:43:31.795626 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-util" (OuterVolumeSpecName: "util") pod "0603a771-781d-4522-9756-cbf1631640e1" (UID: "0603a771-781d-4522-9756-cbf1631640e1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:31.872468 master-0 kubenswrapper[13205]: I0319 09:43:31.872309 13205 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:31.872468 master-0 kubenswrapper[13205]: I0319 09:43:31.872347 13205 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0603a771-781d-4522-9756-cbf1631640e1-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:31.872468 master-0 kubenswrapper[13205]: I0319 09:43:31.872362 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-526qk\" (UniqueName: \"kubernetes.io/projected/0603a771-781d-4522-9756-cbf1631640e1-kube-api-access-526qk\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:32.240297 master-0 kubenswrapper[13205]: I0319 09:43:32.240115 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" event={"ID":"0603a771-781d-4522-9756-cbf1631640e1","Type":"ContainerDied","Data":"04aba79421f28e764b56c2ff42b858c333cce704a8d424a86852a9b7bc717e08"} Mar 19 09:43:32.240297 master-0 kubenswrapper[13205]: I0319 09:43:32.240156 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04aba79421f28e764b56c2ff42b858c333cce704a8d424a86852a9b7bc717e08" Mar 19 09:43:32.240297 master-0 kubenswrapper[13205]: I0319 09:43:32.240196 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q5wdz" Mar 19 09:43:32.903591 master-0 kubenswrapper[13205]: I0319 09:43:32.903459 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq"] Mar 19 09:43:32.903814 master-0 kubenswrapper[13205]: E0319 09:43:32.903790 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerName="extract" Mar 19 09:43:32.903814 master-0 kubenswrapper[13205]: I0319 09:43:32.903801 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerName="extract" Mar 19 09:43:32.903885 master-0 kubenswrapper[13205]: E0319 09:43:32.903817 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0603a771-781d-4522-9756-cbf1631640e1" containerName="extract" Mar 19 09:43:32.903885 master-0 kubenswrapper[13205]: I0319 09:43:32.903823 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="0603a771-781d-4522-9756-cbf1631640e1" containerName="extract" Mar 19 09:43:32.903885 master-0 kubenswrapper[13205]: E0319 09:43:32.903837 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerName="extract" Mar 19 09:43:32.903885 
master-0 kubenswrapper[13205]: I0319 09:43:32.903844 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerName="extract" Mar 19 09:43:32.903885 master-0 kubenswrapper[13205]: E0319 09:43:32.903861 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerName="util" Mar 19 09:43:32.903885 master-0 kubenswrapper[13205]: I0319 09:43:32.903867 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerName="util" Mar 19 09:43:32.903885 master-0 kubenswrapper[13205]: E0319 09:43:32.903876 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerName="util" Mar 19 09:43:32.903885 master-0 kubenswrapper[13205]: I0319 09:43:32.903884 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerName="util" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: E0319 09:43:32.903896 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerName="pull" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: I0319 09:43:32.903902 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerName="pull" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: E0319 09:43:32.903916 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0603a771-781d-4522-9756-cbf1631640e1" containerName="util" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: I0319 09:43:32.903921 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="0603a771-781d-4522-9756-cbf1631640e1" containerName="util" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: E0319 09:43:32.903933 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0603a771-781d-4522-9756-cbf1631640e1" containerName="pull" Mar 
19 09:43:32.904180 master-0 kubenswrapper[13205]: I0319 09:43:32.903938 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="0603a771-781d-4522-9756-cbf1631640e1" containerName="pull" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: E0319 09:43:32.903947 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerName="pull" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: I0319 09:43:32.903954 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerName="pull" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: I0319 09:43:32.904086 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a042d5e-e45b-4b6a-a9ec-6c476129b86c" containerName="extract" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: I0319 09:43:32.904112 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea5f323-0902-4f06-a00d-2e60ddd72d84" containerName="extract" Mar 19 09:43:32.904180 master-0 kubenswrapper[13205]: I0319 09:43:32.904122 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="0603a771-781d-4522-9756-cbf1631640e1" containerName="extract" Mar 19 09:43:32.905009 master-0 kubenswrapper[13205]: I0319 09:43:32.904986 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:32.923214 master-0 kubenswrapper[13205]: I0319 09:43:32.923159 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq"] Mar 19 09:43:32.998547 master-0 kubenswrapper[13205]: I0319 09:43:32.997681 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:32.998547 master-0 kubenswrapper[13205]: I0319 09:43:32.997742 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:32.998547 master-0 kubenswrapper[13205]: I0319 09:43:32.997784 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ctlg\" (UniqueName: \"kubernetes.io/projected/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-kube-api-access-8ctlg\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.099280 master-0 kubenswrapper[13205]: I0319 09:43:33.099222 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.099513 master-0 kubenswrapper[13205]: I0319 09:43:33.099301 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ctlg\" (UniqueName: \"kubernetes.io/projected/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-kube-api-access-8ctlg\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.099513 master-0 kubenswrapper[13205]: I0319 09:43:33.099379 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.099891 master-0 kubenswrapper[13205]: I0319 09:43:33.099853 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.099891 master-0 kubenswrapper[13205]: I0319 09:43:33.099848 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-util\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.132292 master-0 kubenswrapper[13205]: I0319 09:43:33.132244 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ctlg\" (UniqueName: \"kubernetes.io/projected/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-kube-api-access-8ctlg\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.221785 master-0 kubenswrapper[13205]: I0319 09:43:33.221652 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:33.815243 master-0 kubenswrapper[13205]: I0319 09:43:33.814644 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq"] Mar 19 09:43:34.260242 master-0 kubenswrapper[13205]: I0319 09:43:34.260154 13205 generic.go:334] "Generic (PLEG): container finished" podID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerID="3d9de22661cbf1b1a6e582b21da471268074705945a6fe5a22473c57307fddad" exitCode=0 Mar 19 09:43:34.260242 master-0 kubenswrapper[13205]: I0319 09:43:34.260209 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" event={"ID":"96452d7e-c16a-4dbe-ae9d-ccc1f8473588","Type":"ContainerDied","Data":"3d9de22661cbf1b1a6e582b21da471268074705945a6fe5a22473c57307fddad"} Mar 19 09:43:34.260242 master-0 kubenswrapper[13205]: I0319 09:43:34.260238 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" event={"ID":"96452d7e-c16a-4dbe-ae9d-ccc1f8473588","Type":"ContainerStarted","Data":"6c76530183136dbbc3e56c9c92bdca6666ccc0f900a9ec92b433641079d507b1"} Mar 19 09:43:35.213884 master-0 kubenswrapper[13205]: E0319 09:43:35.213810 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:43:36.091134 master-0 kubenswrapper[13205]: I0319 09:43:36.091060 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv"] Mar 19 09:43:36.091889 master-0 kubenswrapper[13205]: I0319 09:43:36.091860 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.093698 master-0 kubenswrapper[13205]: I0319 09:43:36.093662 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 19 09:43:36.093785 master-0 kubenswrapper[13205]: I0319 09:43:36.093693 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 19 09:43:36.109975 master-0 kubenswrapper[13205]: I0319 09:43:36.109917 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv"] Mar 19 09:43:36.152292 master-0 kubenswrapper[13205]: I0319 09:43:36.152231 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zttfz\" (UniqueName: \"kubernetes.io/projected/01832dab-4972-4f72-8ef4-4d2dae6b585d-kube-api-access-zttfz\") pod \"cert-manager-operator-controller-manager-66c8bdd694-p47bv\" (UID: \"01832dab-4972-4f72-8ef4-4d2dae6b585d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.152513 master-0 kubenswrapper[13205]: I0319 09:43:36.152336 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/01832dab-4972-4f72-8ef4-4d2dae6b585d-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-p47bv\" (UID: \"01832dab-4972-4f72-8ef4-4d2dae6b585d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.254549 master-0 kubenswrapper[13205]: I0319 09:43:36.253890 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zttfz\" (UniqueName: \"kubernetes.io/projected/01832dab-4972-4f72-8ef4-4d2dae6b585d-kube-api-access-zttfz\") pod 
\"cert-manager-operator-controller-manager-66c8bdd694-p47bv\" (UID: \"01832dab-4972-4f72-8ef4-4d2dae6b585d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.254549 master-0 kubenswrapper[13205]: I0319 09:43:36.253979 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/01832dab-4972-4f72-8ef4-4d2dae6b585d-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-p47bv\" (UID: \"01832dab-4972-4f72-8ef4-4d2dae6b585d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.254549 master-0 kubenswrapper[13205]: I0319 09:43:36.254419 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/01832dab-4972-4f72-8ef4-4d2dae6b585d-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-p47bv\" (UID: \"01832dab-4972-4f72-8ef4-4d2dae6b585d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.293550 master-0 kubenswrapper[13205]: I0319 09:43:36.292622 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zttfz\" (UniqueName: \"kubernetes.io/projected/01832dab-4972-4f72-8ef4-4d2dae6b585d-kube-api-access-zttfz\") pod \"cert-manager-operator-controller-manager-66c8bdd694-p47bv\" (UID: \"01832dab-4972-4f72-8ef4-4d2dae6b585d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.300544 master-0 kubenswrapper[13205]: I0319 09:43:36.299856 13205 generic.go:334] "Generic (PLEG): container finished" podID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerID="e724b4af1727ff556822c06b34395189dcb4110e9cb27bce42d5eddbaa410633" exitCode=0 Mar 19 09:43:36.300544 master-0 kubenswrapper[13205]: I0319 09:43:36.299903 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" event={"ID":"96452d7e-c16a-4dbe-ae9d-ccc1f8473588","Type":"ContainerDied","Data":"e724b4af1727ff556822c06b34395189dcb4110e9cb27bce42d5eddbaa410633"} Mar 19 09:43:36.463579 master-0 kubenswrapper[13205]: I0319 09:43:36.463501 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" Mar 19 09:43:36.940318 master-0 kubenswrapper[13205]: W0319 09:43:36.940268 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01832dab_4972_4f72_8ef4_4d2dae6b585d.slice/crio-2b3598c6c114c44e94428027a069e826de99610b10c41bca04181ec72cae678a WatchSource:0}: Error finding container 2b3598c6c114c44e94428027a069e826de99610b10c41bca04181ec72cae678a: Status 404 returned error can't find the container with id 2b3598c6c114c44e94428027a069e826de99610b10c41bca04181ec72cae678a Mar 19 09:43:36.942365 master-0 kubenswrapper[13205]: I0319 09:43:36.942324 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv"] Mar 19 09:43:37.307187 master-0 kubenswrapper[13205]: I0319 09:43:37.307129 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" event={"ID":"01832dab-4972-4f72-8ef4-4d2dae6b585d","Type":"ContainerStarted","Data":"2b3598c6c114c44e94428027a069e826de99610b10c41bca04181ec72cae678a"} Mar 19 09:43:37.311602 master-0 kubenswrapper[13205]: I0319 09:43:37.311560 13205 generic.go:334] "Generic (PLEG): container finished" podID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerID="e7a0bf3f5f17f1713a4014f06b17dd1006dd32390d6f60311db31fd2c856f26b" exitCode=0 Mar 19 09:43:37.311602 master-0 kubenswrapper[13205]: I0319 09:43:37.311599 13205 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" event={"ID":"96452d7e-c16a-4dbe-ae9d-ccc1f8473588","Type":"ContainerDied","Data":"e7a0bf3f5f17f1713a4014f06b17dd1006dd32390d6f60311db31fd2c856f26b"} Mar 19 09:43:38.653942 master-0 kubenswrapper[13205]: I0319 09:43:38.653891 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:38.809932 master-0 kubenswrapper[13205]: I0319 09:43:38.809879 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-util\") pod \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " Mar 19 09:43:38.809932 master-0 kubenswrapper[13205]: I0319 09:43:38.809928 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ctlg\" (UniqueName: \"kubernetes.io/projected/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-kube-api-access-8ctlg\") pod \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " Mar 19 09:43:38.810168 master-0 kubenswrapper[13205]: I0319 09:43:38.809994 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-bundle\") pod \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\" (UID: \"96452d7e-c16a-4dbe-ae9d-ccc1f8473588\") " Mar 19 09:43:38.812369 master-0 kubenswrapper[13205]: I0319 09:43:38.812335 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-bundle" (OuterVolumeSpecName: "bundle") pod "96452d7e-c16a-4dbe-ae9d-ccc1f8473588" (UID: "96452d7e-c16a-4dbe-ae9d-ccc1f8473588"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:38.816671 master-0 kubenswrapper[13205]: I0319 09:43:38.816620 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-kube-api-access-8ctlg" (OuterVolumeSpecName: "kube-api-access-8ctlg") pod "96452d7e-c16a-4dbe-ae9d-ccc1f8473588" (UID: "96452d7e-c16a-4dbe-ae9d-ccc1f8473588"). InnerVolumeSpecName "kube-api-access-8ctlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:43:38.822236 master-0 kubenswrapper[13205]: I0319 09:43:38.822196 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-util" (OuterVolumeSpecName: "util") pod "96452d7e-c16a-4dbe-ae9d-ccc1f8473588" (UID: "96452d7e-c16a-4dbe-ae9d-ccc1f8473588"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:38.912342 master-0 kubenswrapper[13205]: I0319 09:43:38.912271 13205 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:38.912342 master-0 kubenswrapper[13205]: I0319 09:43:38.912316 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ctlg\" (UniqueName: \"kubernetes.io/projected/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-kube-api-access-8ctlg\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:38.912342 master-0 kubenswrapper[13205]: I0319 09:43:38.912332 13205 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96452d7e-c16a-4dbe-ae9d-ccc1f8473588-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:43:39.347562 master-0 kubenswrapper[13205]: I0319 09:43:39.340651 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" event={"ID":"96452d7e-c16a-4dbe-ae9d-ccc1f8473588","Type":"ContainerDied","Data":"6c76530183136dbbc3e56c9c92bdca6666ccc0f900a9ec92b433641079d507b1"} Mar 19 09:43:39.347562 master-0 kubenswrapper[13205]: I0319 09:43:39.340704 13205 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c76530183136dbbc3e56c9c92bdca6666ccc0f900a9ec92b433641079d507b1" Mar 19 09:43:39.347562 master-0 kubenswrapper[13205]: I0319 09:43:39.340824 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726px5gq" Mar 19 09:43:43.375732 master-0 kubenswrapper[13205]: I0319 09:43:43.375652 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" event={"ID":"01832dab-4972-4f72-8ef4-4d2dae6b585d","Type":"ContainerStarted","Data":"8ab712d1b18553debe8f9dc1eacd449957e0372027a855b084db8a3b087661c9"} Mar 19 09:43:43.533917 master-0 kubenswrapper[13205]: I0319 09:43:43.533792 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-p47bv" podStartSLOduration=1.824000209 podStartE2EDuration="7.533764311s" podCreationTimestamp="2026-03-19 09:43:36 +0000 UTC" firstStartedPulling="2026-03-19 09:43:36.945164853 +0000 UTC m=+1202.277471741" lastFinishedPulling="2026-03-19 09:43:42.654928955 +0000 UTC m=+1207.987235843" observedRunningTime="2026-03-19 09:43:43.520405967 +0000 UTC m=+1208.852712875" watchObservedRunningTime="2026-03-19 09:43:43.533764311 +0000 UTC m=+1208.866071199" Mar 19 09:43:47.397661 master-0 kubenswrapper[13205]: I0319 09:43:47.397414 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ntbld"] Mar 19 09:43:47.398998 master-0 
kubenswrapper[13205]: E0319 09:43:47.397982 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerName="util" Mar 19 09:43:47.398998 master-0 kubenswrapper[13205]: I0319 09:43:47.398012 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerName="util" Mar 19 09:43:47.398998 master-0 kubenswrapper[13205]: E0319 09:43:47.398075 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerName="pull" Mar 19 09:43:47.398998 master-0 kubenswrapper[13205]: I0319 09:43:47.398093 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerName="pull" Mar 19 09:43:47.398998 master-0 kubenswrapper[13205]: E0319 09:43:47.398129 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerName="extract" Mar 19 09:43:47.398998 master-0 kubenswrapper[13205]: I0319 09:43:47.398149 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerName="extract" Mar 19 09:43:47.398998 master-0 kubenswrapper[13205]: I0319 09:43:47.398492 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="96452d7e-c16a-4dbe-ae9d-ccc1f8473588" containerName="extract" Mar 19 09:43:47.399770 master-0 kubenswrapper[13205]: I0319 09:43:47.399425 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:47.402340 master-0 kubenswrapper[13205]: I0319 09:43:47.402266 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 09:43:47.403825 master-0 kubenswrapper[13205]: I0319 09:43:47.403745 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 09:43:47.432182 master-0 kubenswrapper[13205]: I0319 09:43:47.432133 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ntbld"] Mar 19 09:43:47.565172 master-0 kubenswrapper[13205]: I0319 09:43:47.565099 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfba36a4-7184-4ff9-b46f-e196c7252fbf-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ntbld\" (UID: \"cfba36a4-7184-4ff9-b46f-e196c7252fbf\") " pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:47.565398 master-0 kubenswrapper[13205]: I0319 09:43:47.565207 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lbk\" (UniqueName: \"kubernetes.io/projected/cfba36a4-7184-4ff9-b46f-e196c7252fbf-kube-api-access-68lbk\") pod \"cert-manager-webhook-6888856db4-ntbld\" (UID: \"cfba36a4-7184-4ff9-b46f-e196c7252fbf\") " pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:47.666565 master-0 kubenswrapper[13205]: I0319 09:43:47.666420 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lbk\" (UniqueName: \"kubernetes.io/projected/cfba36a4-7184-4ff9-b46f-e196c7252fbf-kube-api-access-68lbk\") pod \"cert-manager-webhook-6888856db4-ntbld\" (UID: \"cfba36a4-7184-4ff9-b46f-e196c7252fbf\") " pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 
09:43:47.666774 master-0 kubenswrapper[13205]: I0319 09:43:47.666640 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfba36a4-7184-4ff9-b46f-e196c7252fbf-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ntbld\" (UID: \"cfba36a4-7184-4ff9-b46f-e196c7252fbf\") " pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:47.833177 master-0 kubenswrapper[13205]: I0319 09:43:47.833139 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lbk\" (UniqueName: \"kubernetes.io/projected/cfba36a4-7184-4ff9-b46f-e196c7252fbf-kube-api-access-68lbk\") pod \"cert-manager-webhook-6888856db4-ntbld\" (UID: \"cfba36a4-7184-4ff9-b46f-e196c7252fbf\") " pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:47.833381 master-0 kubenswrapper[13205]: I0319 09:43:47.833282 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfba36a4-7184-4ff9-b46f-e196c7252fbf-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-ntbld\" (UID: \"cfba36a4-7184-4ff9-b46f-e196c7252fbf\") " pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:48.032878 master-0 kubenswrapper[13205]: I0319 09:43:48.032809 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:48.559889 master-0 kubenswrapper[13205]: I0319 09:43:48.559839 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-ntbld"] Mar 19 09:43:48.563625 master-0 kubenswrapper[13205]: W0319 09:43:48.563559 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfba36a4_7184_4ff9_b46f_e196c7252fbf.slice/crio-e60adc48d461e61a284c52933b692085eef93d0e439bd70835939cd9cc2bf33d WatchSource:0}: Error finding container e60adc48d461e61a284c52933b692085eef93d0e439bd70835939cd9cc2bf33d: Status 404 returned error can't find the container with id e60adc48d461e61a284c52933b692085eef93d0e439bd70835939cd9cc2bf33d Mar 19 09:43:48.976087 master-0 kubenswrapper[13205]: I0319 09:43:48.975909 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8wzjf"] Mar 19 09:43:48.977351 master-0 kubenswrapper[13205]: I0319 09:43:48.977295 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:49.085408 master-0 kubenswrapper[13205]: I0319 09:43:49.085304 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8wzjf"] Mar 19 09:43:49.097501 master-0 kubenswrapper[13205]: I0319 09:43:49.097465 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c953c26-428d-4eb2-b88a-d98a21bb27c2-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8wzjf\" (UID: \"5c953c26-428d-4eb2-b88a-d98a21bb27c2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:49.097772 master-0 kubenswrapper[13205]: I0319 09:43:49.097742 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjh6\" (UniqueName: \"kubernetes.io/projected/5c953c26-428d-4eb2-b88a-d98a21bb27c2-kube-api-access-grjh6\") pod \"cert-manager-cainjector-5545bd876-8wzjf\" (UID: \"5c953c26-428d-4eb2-b88a-d98a21bb27c2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:49.200723 master-0 kubenswrapper[13205]: I0319 09:43:49.200645 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c953c26-428d-4eb2-b88a-d98a21bb27c2-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8wzjf\" (UID: \"5c953c26-428d-4eb2-b88a-d98a21bb27c2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:49.201182 master-0 kubenswrapper[13205]: I0319 09:43:49.201131 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grjh6\" (UniqueName: \"kubernetes.io/projected/5c953c26-428d-4eb2-b88a-d98a21bb27c2-kube-api-access-grjh6\") pod \"cert-manager-cainjector-5545bd876-8wzjf\" (UID: \"5c953c26-428d-4eb2-b88a-d98a21bb27c2\") " 
pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:49.392049 master-0 kubenswrapper[13205]: I0319 09:43:49.391971 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c953c26-428d-4eb2-b88a-d98a21bb27c2-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8wzjf\" (UID: \"5c953c26-428d-4eb2-b88a-d98a21bb27c2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:49.397229 master-0 kubenswrapper[13205]: I0319 09:43:49.397170 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjh6\" (UniqueName: \"kubernetes.io/projected/5c953c26-428d-4eb2-b88a-d98a21bb27c2-kube-api-access-grjh6\") pod \"cert-manager-cainjector-5545bd876-8wzjf\" (UID: \"5c953c26-428d-4eb2-b88a-d98a21bb27c2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:49.449204 master-0 kubenswrapper[13205]: I0319 09:43:49.449161 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" event={"ID":"cfba36a4-7184-4ff9-b46f-e196c7252fbf","Type":"ContainerStarted","Data":"e60adc48d461e61a284c52933b692085eef93d0e439bd70835939cd9cc2bf33d"} Mar 19 09:43:49.600020 master-0 kubenswrapper[13205]: I0319 09:43:49.599965 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" Mar 19 09:43:50.177110 master-0 kubenswrapper[13205]: I0319 09:43:50.177017 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8wzjf"] Mar 19 09:43:50.455442 master-0 kubenswrapper[13205]: I0319 09:43:50.455290 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" event={"ID":"5c953c26-428d-4eb2-b88a-d98a21bb27c2","Type":"ContainerStarted","Data":"4b72e3f5652de4122e90aa2f2d249e6270a160785522e685680f0a17226a0489"} Mar 19 09:43:53.723480 master-0 kubenswrapper[13205]: I0319 09:43:53.723409 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-95c7k"] Mar 19 09:43:53.724243 master-0 kubenswrapper[13205]: I0319 09:43:53.724215 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" Mar 19 09:43:53.726951 master-0 kubenswrapper[13205]: I0319 09:43:53.726915 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 09:43:53.727113 master-0 kubenswrapper[13205]: I0319 09:43:53.727096 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 09:43:53.836230 master-0 kubenswrapper[13205]: I0319 09:43:53.836165 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkss8\" (UniqueName: \"kubernetes.io/projected/6ab170f8-577b-4f2c-a8f3-8fe1a5e45274-kube-api-access-kkss8\") pod \"nmstate-operator-796d4cfff4-95c7k\" (UID: \"6ab170f8-577b-4f2c-a8f3-8fe1a5e45274\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" Mar 19 09:43:53.879248 master-0 kubenswrapper[13205]: I0319 09:43:53.879190 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-operator-796d4cfff4-95c7k"] Mar 19 09:43:53.941397 master-0 kubenswrapper[13205]: I0319 09:43:53.940491 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkss8\" (UniqueName: \"kubernetes.io/projected/6ab170f8-577b-4f2c-a8f3-8fe1a5e45274-kube-api-access-kkss8\") pod \"nmstate-operator-796d4cfff4-95c7k\" (UID: \"6ab170f8-577b-4f2c-a8f3-8fe1a5e45274\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" Mar 19 09:43:54.064400 master-0 kubenswrapper[13205]: I0319 09:43:54.064349 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkss8\" (UniqueName: \"kubernetes.io/projected/6ab170f8-577b-4f2c-a8f3-8fe1a5e45274-kube-api-access-kkss8\") pod \"nmstate-operator-796d4cfff4-95c7k\" (UID: \"6ab170f8-577b-4f2c-a8f3-8fe1a5e45274\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" Mar 19 09:43:54.343622 master-0 kubenswrapper[13205]: I0319 09:43:54.343422 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" Mar 19 09:43:56.186953 master-0 kubenswrapper[13205]: I0319 09:43:56.186889 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-95c7k"] Mar 19 09:43:56.223884 master-0 kubenswrapper[13205]: I0319 09:43:56.223795 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql"] Mar 19 09:43:56.225170 master-0 kubenswrapper[13205]: I0319 09:43:56.225127 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.227082 master-0 kubenswrapper[13205]: I0319 09:43:56.227044 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 09:43:56.227174 master-0 kubenswrapper[13205]: I0319 09:43:56.227054 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 09:43:56.227174 master-0 kubenswrapper[13205]: I0319 09:43:56.227140 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 09:43:56.228976 master-0 kubenswrapper[13205]: I0319 09:43:56.228939 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 09:43:56.284456 master-0 kubenswrapper[13205]: I0319 09:43:56.284395 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3b14285-61c6-4760-85bc-64667c85f8af-webhook-cert\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.284757 master-0 kubenswrapper[13205]: I0319 09:43:56.284495 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3b14285-61c6-4760-85bc-64667c85f8af-apiservice-cert\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.284757 master-0 kubenswrapper[13205]: I0319 09:43:56.284554 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rz7gc\" (UniqueName: \"kubernetes.io/projected/b3b14285-61c6-4760-85bc-64667c85f8af-kube-api-access-rz7gc\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.290393 master-0 kubenswrapper[13205]: I0319 09:43:56.290338 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql"] Mar 19 09:43:56.362619 master-0 kubenswrapper[13205]: W0319 09:43:56.362561 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ab170f8_577b_4f2c_a8f3_8fe1a5e45274.slice/crio-cc29429e466a5277dabddc4c7d992a4e57698eae13bed9baaf44aebfcd04bc29 WatchSource:0}: Error finding container cc29429e466a5277dabddc4c7d992a4e57698eae13bed9baaf44aebfcd04bc29: Status 404 returned error can't find the container with id cc29429e466a5277dabddc4c7d992a4e57698eae13bed9baaf44aebfcd04bc29 Mar 19 09:43:56.390314 master-0 kubenswrapper[13205]: I0319 09:43:56.390250 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3b14285-61c6-4760-85bc-64667c85f8af-apiservice-cert\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.390314 master-0 kubenswrapper[13205]: I0319 09:43:56.390308 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz7gc\" (UniqueName: \"kubernetes.io/projected/b3b14285-61c6-4760-85bc-64667c85f8af-kube-api-access-rz7gc\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " 
pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.390679 master-0 kubenswrapper[13205]: I0319 09:43:56.390369 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3b14285-61c6-4760-85bc-64667c85f8af-webhook-cert\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.399260 master-0 kubenswrapper[13205]: I0319 09:43:56.399214 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3b14285-61c6-4760-85bc-64667c85f8af-webhook-cert\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.401540 master-0 kubenswrapper[13205]: I0319 09:43:56.401481 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3b14285-61c6-4760-85bc-64667c85f8af-apiservice-cert\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.514462 master-0 kubenswrapper[13205]: I0319 09:43:56.514338 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7gc\" (UniqueName: \"kubernetes.io/projected/b3b14285-61c6-4760-85bc-64667c85f8af-kube-api-access-rz7gc\") pod \"metallb-operator-controller-manager-54b99f6f6b-hk7ql\" (UID: \"b3b14285-61c6-4760-85bc-64667c85f8af\") " pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:56.525030 master-0 kubenswrapper[13205]: I0319 09:43:56.524971 13205 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" event={"ID":"6ab170f8-577b-4f2c-a8f3-8fe1a5e45274","Type":"ContainerStarted","Data":"cc29429e466a5277dabddc4c7d992a4e57698eae13bed9baaf44aebfcd04bc29"} Mar 19 09:43:56.542037 master-0 kubenswrapper[13205]: I0319 09:43:56.541948 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:43:57.064951 master-0 kubenswrapper[13205]: I0319 09:43:57.064364 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz"] Mar 19 09:43:57.067244 master-0 kubenswrapper[13205]: I0319 09:43:57.066093 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.070061 master-0 kubenswrapper[13205]: I0319 09:43:57.069774 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 09:43:57.070061 master-0 kubenswrapper[13205]: I0319 09:43:57.069916 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 09:43:57.082934 master-0 kubenswrapper[13205]: I0319 09:43:57.081596 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz"] Mar 19 09:43:57.157858 master-0 kubenswrapper[13205]: I0319 09:43:57.154378 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql"] Mar 19 09:43:57.161165 master-0 kubenswrapper[13205]: W0319 09:43:57.158122 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3b14285_61c6_4760_85bc_64667c85f8af.slice/crio-0a77a45b32ab879d2fef6e2fdcfa263e7bf3823d0ba4b897d7f24aee6a6b359a 
WatchSource:0}: Error finding container 0a77a45b32ab879d2fef6e2fdcfa263e7bf3823d0ba4b897d7f24aee6a6b359a: Status 404 returned error can't find the container with id 0a77a45b32ab879d2fef6e2fdcfa263e7bf3823d0ba4b897d7f24aee6a6b359a Mar 19 09:43:57.236755 master-0 kubenswrapper[13205]: I0319 09:43:57.236699 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47802e4a-72b0-4595-a33e-ca548f695f60-apiservice-cert\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.237356 master-0 kubenswrapper[13205]: I0319 09:43:57.236769 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47802e4a-72b0-4595-a33e-ca548f695f60-webhook-cert\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.237356 master-0 kubenswrapper[13205]: I0319 09:43:57.236792 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvsh\" (UniqueName: \"kubernetes.io/projected/47802e4a-72b0-4595-a33e-ca548f695f60-kube-api-access-sjvsh\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.338408 master-0 kubenswrapper[13205]: I0319 09:43:57.338297 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47802e4a-72b0-4595-a33e-ca548f695f60-apiservice-cert\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: 
\"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.338408 master-0 kubenswrapper[13205]: I0319 09:43:57.338356 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47802e4a-72b0-4595-a33e-ca548f695f60-webhook-cert\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.338408 master-0 kubenswrapper[13205]: I0319 09:43:57.338381 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvsh\" (UniqueName: \"kubernetes.io/projected/47802e4a-72b0-4595-a33e-ca548f695f60-kube-api-access-sjvsh\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.346554 master-0 kubenswrapper[13205]: I0319 09:43:57.342006 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47802e4a-72b0-4595-a33e-ca548f695f60-apiservice-cert\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.346554 master-0 kubenswrapper[13205]: I0319 09:43:57.345028 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47802e4a-72b0-4595-a33e-ca548f695f60-webhook-cert\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.367024 master-0 kubenswrapper[13205]: I0319 09:43:57.366643 13205 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sjvsh\" (UniqueName: \"kubernetes.io/projected/47802e4a-72b0-4595-a33e-ca548f695f60-kube-api-access-sjvsh\") pod \"metallb-operator-webhook-server-69bcd667c-x84zz\" (UID: \"47802e4a-72b0-4595-a33e-ca548f695f60\") " pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.398550 master-0 kubenswrapper[13205]: I0319 09:43:57.397797 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:43:57.543616 master-0 kubenswrapper[13205]: I0319 09:43:57.542960 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" event={"ID":"cfba36a4-7184-4ff9-b46f-e196c7252fbf","Type":"ContainerStarted","Data":"0c0fd81bff09796aa388a1c7f05aac4d7491b01b5d4aa47a5293d61341311cdc"} Mar 19 09:43:57.543829 master-0 kubenswrapper[13205]: I0319 09:43:57.543670 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:43:57.551562 master-0 kubenswrapper[13205]: I0319 09:43:57.546604 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" event={"ID":"b3b14285-61c6-4760-85bc-64667c85f8af","Type":"ContainerStarted","Data":"0a77a45b32ab879d2fef6e2fdcfa263e7bf3823d0ba4b897d7f24aee6a6b359a"} Mar 19 09:43:57.556576 master-0 kubenswrapper[13205]: I0319 09:43:57.554828 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" event={"ID":"5c953c26-428d-4eb2-b88a-d98a21bb27c2","Type":"ContainerStarted","Data":"0c38dbabab657271998ac6d5b742ad6a17ac1d301622cd81f944d74aa1465ccf"} Mar 19 09:43:57.697690 master-0 kubenswrapper[13205]: I0319 09:43:57.681770 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" podStartSLOduration=2.702237018 podStartE2EDuration="10.681752127s" podCreationTimestamp="2026-03-19 09:43:47 +0000 UTC" firstStartedPulling="2026-03-19 09:43:48.565940262 +0000 UTC m=+1213.898247160" lastFinishedPulling="2026-03-19 09:43:56.545455381 +0000 UTC m=+1221.877762269" observedRunningTime="2026-03-19 09:43:57.591031095 +0000 UTC m=+1222.923337983" watchObservedRunningTime="2026-03-19 09:43:57.681752127 +0000 UTC m=+1223.014059015" Mar 19 09:43:57.697690 master-0 kubenswrapper[13205]: I0319 09:43:57.683065 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-8wzjf" podStartSLOduration=3.287657991 podStartE2EDuration="9.683057239s" podCreationTimestamp="2026-03-19 09:43:48 +0000 UTC" firstStartedPulling="2026-03-19 09:43:50.193118248 +0000 UTC m=+1215.525425136" lastFinishedPulling="2026-03-19 09:43:56.588517486 +0000 UTC m=+1221.920824384" observedRunningTime="2026-03-19 09:43:57.680677821 +0000 UTC m=+1223.012984709" watchObservedRunningTime="2026-03-19 09:43:57.683057239 +0000 UTC m=+1223.015364127" Mar 19 09:43:58.272287 master-0 kubenswrapper[13205]: W0319 09:43:58.268516 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47802e4a_72b0_4595_a33e_ca548f695f60.slice/crio-9562fd0b424ff8c11d04600ae11eb0516dfac760b93188d6b4b5c094d37b15fb WatchSource:0}: Error finding container 9562fd0b424ff8c11d04600ae11eb0516dfac760b93188d6b4b5c094d37b15fb: Status 404 returned error can't find the container with id 9562fd0b424ff8c11d04600ae11eb0516dfac760b93188d6b4b5c094d37b15fb Mar 19 09:43:58.273832 master-0 kubenswrapper[13205]: I0319 09:43:58.273796 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz"] Mar 19 09:43:58.587147 master-0 kubenswrapper[13205]: I0319 09:43:58.587099 13205 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" event={"ID":"47802e4a-72b0-4595-a33e-ca548f695f60","Type":"ContainerStarted","Data":"9562fd0b424ff8c11d04600ae11eb0516dfac760b93188d6b4b5c094d37b15fb"} Mar 19 09:44:02.631561 master-0 kubenswrapper[13205]: I0319 09:44:02.631469 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" event={"ID":"6ab170f8-577b-4f2c-a8f3-8fe1a5e45274","Type":"ContainerStarted","Data":"7bc756c3f86fa25e1e947047176667fabb0138416cf67288aa4394784635abbb"} Mar 19 09:44:02.661200 master-0 kubenswrapper[13205]: I0319 09:44:02.661110 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-95c7k" podStartSLOduration=4.187255692 podStartE2EDuration="9.661093876s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:56.395815018 +0000 UTC m=+1221.728121906" lastFinishedPulling="2026-03-19 09:44:01.869653212 +0000 UTC m=+1227.201960090" observedRunningTime="2026-03-19 09:44:02.653191865 +0000 UTC m=+1227.985498753" watchObservedRunningTime="2026-03-19 09:44:02.661093876 +0000 UTC m=+1227.993400764" Mar 19 09:44:03.037136 master-0 kubenswrapper[13205]: I0319 09:44:03.035992 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-ntbld" Mar 19 09:44:04.661711 master-0 kubenswrapper[13205]: I0319 09:44:04.658174 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" event={"ID":"b3b14285-61c6-4760-85bc-64667c85f8af","Type":"ContainerStarted","Data":"e53f7e299b20413ab325810a56706d8d135c896aa7ddd1f98d910c3277d89653"} Mar 19 09:44:04.661711 master-0 kubenswrapper[13205]: I0319 09:44:04.659283 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:44:04.682598 master-0 kubenswrapper[13205]: I0319 09:44:04.681880 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" podStartSLOduration=3.318215231 podStartE2EDuration="9.681858998s" podCreationTimestamp="2026-03-19 09:43:55 +0000 UTC" firstStartedPulling="2026-03-19 09:43:57.163479315 +0000 UTC m=+1222.495786203" lastFinishedPulling="2026-03-19 09:44:03.527123082 +0000 UTC m=+1228.859429970" observedRunningTime="2026-03-19 09:44:04.676737253 +0000 UTC m=+1230.009044131" watchObservedRunningTime="2026-03-19 09:44:04.681858998 +0000 UTC m=+1230.014165886" Mar 19 09:44:05.545750 master-0 kubenswrapper[13205]: I0319 09:44:05.545676 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-65r6f"] Mar 19 09:44:05.546951 master-0 kubenswrapper[13205]: I0319 09:44:05.546928 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:05.553606 master-0 kubenswrapper[13205]: I0319 09:44:05.551681 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-65r6f"] Mar 19 09:44:05.647814 master-0 kubenswrapper[13205]: I0319 09:44:05.647750 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69bb71dd-c77e-44e6-8e15-bc8dc75f41ab-bound-sa-token\") pod \"cert-manager-545d4d4674-65r6f\" (UID: \"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab\") " pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:05.647814 master-0 kubenswrapper[13205]: I0319 09:44:05.647796 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwz7s\" (UniqueName: \"kubernetes.io/projected/69bb71dd-c77e-44e6-8e15-bc8dc75f41ab-kube-api-access-bwz7s\") pod \"cert-manager-545d4d4674-65r6f\" (UID: \"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab\") " pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:05.750617 master-0 kubenswrapper[13205]: I0319 09:44:05.749779 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69bb71dd-c77e-44e6-8e15-bc8dc75f41ab-bound-sa-token\") pod \"cert-manager-545d4d4674-65r6f\" (UID: \"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab\") " pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:05.750617 master-0 kubenswrapper[13205]: I0319 09:44:05.749824 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwz7s\" (UniqueName: \"kubernetes.io/projected/69bb71dd-c77e-44e6-8e15-bc8dc75f41ab-kube-api-access-bwz7s\") pod \"cert-manager-545d4d4674-65r6f\" (UID: \"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab\") " pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:05.778553 master-0 
kubenswrapper[13205]: I0319 09:44:05.777656 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwz7s\" (UniqueName: \"kubernetes.io/projected/69bb71dd-c77e-44e6-8e15-bc8dc75f41ab-kube-api-access-bwz7s\") pod \"cert-manager-545d4d4674-65r6f\" (UID: \"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab\") " pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:05.782609 master-0 kubenswrapper[13205]: I0319 09:44:05.781175 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69bb71dd-c77e-44e6-8e15-bc8dc75f41ab-bound-sa-token\") pod \"cert-manager-545d4d4674-65r6f\" (UID: \"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab\") " pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:05.873033 master-0 kubenswrapper[13205]: I0319 09:44:05.872772 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-65r6f" Mar 19 09:44:06.662609 master-0 kubenswrapper[13205]: I0319 09:44:06.662562 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-65r6f"] Mar 19 09:44:06.673310 master-0 kubenswrapper[13205]: I0319 09:44:06.673266 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" event={"ID":"47802e4a-72b0-4595-a33e-ca548f695f60","Type":"ContainerStarted","Data":"3cc0ce6d280e8007c6211f53d6aeb76203118fdb7a3b92e1e727010d01341fc8"} Mar 19 09:44:06.673401 master-0 kubenswrapper[13205]: I0319 09:44:06.673345 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:44:06.674771 master-0 kubenswrapper[13205]: I0319 09:44:06.674725 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-65r6f" 
event={"ID":"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab","Type":"ContainerStarted","Data":"0f4133f67ff54a537ef2506cc42b344189e118e9d1200cc3a3b24ed83704d26a"} Mar 19 09:44:06.702553 master-0 kubenswrapper[13205]: I0319 09:44:06.701637 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" podStartSLOduration=1.7065490479999998 podStartE2EDuration="9.701618603s" podCreationTimestamp="2026-03-19 09:43:57 +0000 UTC" firstStartedPulling="2026-03-19 09:43:58.283562139 +0000 UTC m=+1223.615869027" lastFinishedPulling="2026-03-19 09:44:06.278631694 +0000 UTC m=+1231.610938582" observedRunningTime="2026-03-19 09:44:06.695365652 +0000 UTC m=+1232.027672530" watchObservedRunningTime="2026-03-19 09:44:06.701618603 +0000 UTC m=+1232.033925491" Mar 19 09:44:07.686398 master-0 kubenswrapper[13205]: I0319 09:44:07.686312 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-65r6f" event={"ID":"69bb71dd-c77e-44e6-8e15-bc8dc75f41ab","Type":"ContainerStarted","Data":"c861ea326de8b620f38921375a608d3a43a5b1f5caf54ab7eb3f9111a4b4a983"} Mar 19 09:44:07.714168 master-0 kubenswrapper[13205]: I0319 09:44:07.714035 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-65r6f" podStartSLOduration=2.713998692 podStartE2EDuration="2.713998692s" podCreationTimestamp="2026-03-19 09:44:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:44:07.704662296 +0000 UTC m=+1233.036969224" watchObservedRunningTime="2026-03-19 09:44:07.713998692 +0000 UTC m=+1233.046305680" Mar 19 09:44:08.614760 master-0 kubenswrapper[13205]: I0319 09:44:08.614698 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm"] Mar 19 09:44:08.619423 master-0 kubenswrapper[13205]: I0319 
09:44:08.619378 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" Mar 19 09:44:08.624004 master-0 kubenswrapper[13205]: I0319 09:44:08.623940 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 09:44:08.625517 master-0 kubenswrapper[13205]: I0319 09:44:08.625484 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 09:44:08.645310 master-0 kubenswrapper[13205]: I0319 09:44:08.645248 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm"] Mar 19 09:44:08.810048 master-0 kubenswrapper[13205]: I0319 09:44:08.809986 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwl8r\" (UniqueName: \"kubernetes.io/projected/4598a0ea-67ba-4982-bc67-e77c90508261-kube-api-access-pwl8r\") pod \"obo-prometheus-operator-8ff7d675-cn2rm\" (UID: \"4598a0ea-67ba-4982-bc67-e77c90508261\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" Mar 19 09:44:08.911219 master-0 kubenswrapper[13205]: I0319 09:44:08.911103 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwl8r\" (UniqueName: \"kubernetes.io/projected/4598a0ea-67ba-4982-bc67-e77c90508261-kube-api-access-pwl8r\") pod \"obo-prometheus-operator-8ff7d675-cn2rm\" (UID: \"4598a0ea-67ba-4982-bc67-e77c90508261\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" Mar 19 09:44:08.934671 master-0 kubenswrapper[13205]: I0319 09:44:08.934626 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwl8r\" (UniqueName: \"kubernetes.io/projected/4598a0ea-67ba-4982-bc67-e77c90508261-kube-api-access-pwl8r\") pod \"obo-prometheus-operator-8ff7d675-cn2rm\" (UID: 
\"4598a0ea-67ba-4982-bc67-e77c90508261\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" Mar 19 09:44:08.937577 master-0 kubenswrapper[13205]: I0319 09:44:08.937541 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" Mar 19 09:44:09.120183 master-0 kubenswrapper[13205]: I0319 09:44:09.120126 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2"] Mar 19 09:44:09.123904 master-0 kubenswrapper[13205]: I0319 09:44:09.121743 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.138697 master-0 kubenswrapper[13205]: I0319 09:44:09.127688 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 09:44:09.191625 master-0 kubenswrapper[13205]: I0319 09:44:09.183023 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2"] Mar 19 09:44:09.204092 master-0 kubenswrapper[13205]: I0319 09:44:09.202380 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5"] Mar 19 09:44:09.204092 master-0 kubenswrapper[13205]: I0319 09:44:09.203898 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.214035 master-0 kubenswrapper[13205]: I0319 09:44:09.213976 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5"] Mar 19 09:44:09.223641 master-0 kubenswrapper[13205]: I0319 09:44:09.221036 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00c14551-3cf9-4127-9412-3e40820820ea-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-vgrm2\" (UID: \"00c14551-3cf9-4127-9412-3e40820820ea\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.223641 master-0 kubenswrapper[13205]: I0319 09:44:09.221137 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00c14551-3cf9-4127-9412-3e40820820ea-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-vgrm2\" (UID: \"00c14551-3cf9-4127-9412-3e40820820ea\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.322683 master-0 kubenswrapper[13205]: I0319 09:44:09.322608 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00c14551-3cf9-4127-9412-3e40820820ea-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-vgrm2\" (UID: \"00c14551-3cf9-4127-9412-3e40820820ea\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.322898 master-0 kubenswrapper[13205]: I0319 09:44:09.322703 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/00c14551-3cf9-4127-9412-3e40820820ea-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-vgrm2\" (UID: \"00c14551-3cf9-4127-9412-3e40820820ea\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.322898 master-0 kubenswrapper[13205]: I0319 09:44:09.322737 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1966ff81-3727-4448-8c89-7c412a6d7df2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-nrzl5\" (UID: \"1966ff81-3727-4448-8c89-7c412a6d7df2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.322898 master-0 kubenswrapper[13205]: I0319 09:44:09.322793 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1966ff81-3727-4448-8c89-7c412a6d7df2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-nrzl5\" (UID: \"1966ff81-3727-4448-8c89-7c412a6d7df2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.326683 master-0 kubenswrapper[13205]: I0319 09:44:09.325729 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00c14551-3cf9-4127-9412-3e40820820ea-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-vgrm2\" (UID: \"00c14551-3cf9-4127-9412-3e40820820ea\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.326683 master-0 kubenswrapper[13205]: I0319 09:44:09.326205 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00c14551-3cf9-4127-9412-3e40820820ea-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-64887bc684-vgrm2\" (UID: \"00c14551-3cf9-4127-9412-3e40820820ea\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.423975 master-0 kubenswrapper[13205]: I0319 09:44:09.423913 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1966ff81-3727-4448-8c89-7c412a6d7df2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-nrzl5\" (UID: \"1966ff81-3727-4448-8c89-7c412a6d7df2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.424195 master-0 kubenswrapper[13205]: I0319 09:44:09.424018 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1966ff81-3727-4448-8c89-7c412a6d7df2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-nrzl5\" (UID: \"1966ff81-3727-4448-8c89-7c412a6d7df2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.427544 master-0 kubenswrapper[13205]: I0319 09:44:09.426846 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1966ff81-3727-4448-8c89-7c412a6d7df2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-nrzl5\" (UID: \"1966ff81-3727-4448-8c89-7c412a6d7df2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.434008 master-0 kubenswrapper[13205]: I0319 09:44:09.433959 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1966ff81-3727-4448-8c89-7c412a6d7df2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-64887bc684-nrzl5\" (UID: \"1966ff81-3727-4448-8c89-7c412a6d7df2\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.448938 master-0 kubenswrapper[13205]: I0319 09:44:09.448892 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" Mar 19 09:44:09.452231 master-0 kubenswrapper[13205]: W0319 09:44:09.452160 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4598a0ea_67ba_4982_bc67_e77c90508261.slice/crio-154f8f289db6338969c6606c246941bb469a3f16d15a662f3eb6cfabb5d8d719 WatchSource:0}: Error finding container 154f8f289db6338969c6606c246941bb469a3f16d15a662f3eb6cfabb5d8d719: Status 404 returned error can't find the container with id 154f8f289db6338969c6606c246941bb469a3f16d15a662f3eb6cfabb5d8d719 Mar 19 09:44:09.455258 master-0 kubenswrapper[13205]: I0319 09:44:09.455199 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm"] Mar 19 09:44:09.474704 master-0 kubenswrapper[13205]: I0319 09:44:09.474660 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-cgw5q"] Mar 19 09:44:09.475629 master-0 kubenswrapper[13205]: I0319 09:44:09.475599 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.478858 master-0 kubenswrapper[13205]: I0319 09:44:09.478816 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 09:44:09.527049 master-0 kubenswrapper[13205]: I0319 09:44:09.499414 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-cgw5q"] Mar 19 09:44:09.542553 master-0 kubenswrapper[13205]: I0319 09:44:09.538942 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" Mar 19 09:44:09.632976 master-0 kubenswrapper[13205]: I0319 09:44:09.632918 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rl2v\" (UniqueName: \"kubernetes.io/projected/642e3b20-075e-4abe-9258-c47e385f1995-kube-api-access-2rl2v\") pod \"observability-operator-6dd7dd855f-cgw5q\" (UID: \"642e3b20-075e-4abe-9258-c47e385f1995\") " pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.633074 master-0 kubenswrapper[13205]: I0319 09:44:09.633014 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/642e3b20-075e-4abe-9258-c47e385f1995-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-cgw5q\" (UID: \"642e3b20-075e-4abe-9258-c47e385f1995\") " pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.718649 master-0 kubenswrapper[13205]: I0319 09:44:09.718595 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" 
event={"ID":"4598a0ea-67ba-4982-bc67-e77c90508261","Type":"ContainerStarted","Data":"154f8f289db6338969c6606c246941bb469a3f16d15a662f3eb6cfabb5d8d719"} Mar 19 09:44:09.744721 master-0 kubenswrapper[13205]: I0319 09:44:09.734546 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rl2v\" (UniqueName: \"kubernetes.io/projected/642e3b20-075e-4abe-9258-c47e385f1995-kube-api-access-2rl2v\") pod \"observability-operator-6dd7dd855f-cgw5q\" (UID: \"642e3b20-075e-4abe-9258-c47e385f1995\") " pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.744721 master-0 kubenswrapper[13205]: I0319 09:44:09.734610 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/642e3b20-075e-4abe-9258-c47e385f1995-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-cgw5q\" (UID: \"642e3b20-075e-4abe-9258-c47e385f1995\") " pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.744721 master-0 kubenswrapper[13205]: I0319 09:44:09.743951 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/642e3b20-075e-4abe-9258-c47e385f1995-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-cgw5q\" (UID: \"642e3b20-075e-4abe-9258-c47e385f1995\") " pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.762096 master-0 kubenswrapper[13205]: I0319 09:44:09.762045 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rl2v\" (UniqueName: \"kubernetes.io/projected/642e3b20-075e-4abe-9258-c47e385f1995-kube-api-access-2rl2v\") pod \"observability-operator-6dd7dd855f-cgw5q\" (UID: \"642e3b20-075e-4abe-9258-c47e385f1995\") " pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.833589 master-0 kubenswrapper[13205]: 
I0319 09:44:09.833105 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:09.950111 master-0 kubenswrapper[13205]: I0319 09:44:09.950010 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2"] Mar 19 09:44:10.008014 master-0 kubenswrapper[13205]: I0319 09:44:10.007959 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-d4f7cd8d5-2v5tt"] Mar 19 09:44:10.009639 master-0 kubenswrapper[13205]: I0319 09:44:10.009623 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.011518 master-0 kubenswrapper[13205]: I0319 09:44:10.011455 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 09:44:10.020408 master-0 kubenswrapper[13205]: I0319 09:44:10.020360 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-d4f7cd8d5-2v5tt"] Mar 19 09:44:10.075569 master-0 kubenswrapper[13205]: I0319 09:44:10.072415 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5"] Mar 19 09:44:10.155544 master-0 kubenswrapper[13205]: I0319 09:44:10.152929 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d9tv\" (UniqueName: \"kubernetes.io/projected/1ba0a560-3156-446f-b451-47f129706196-kube-api-access-5d9tv\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.155544 master-0 kubenswrapper[13205]: I0319 09:44:10.153009 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ba0a560-3156-446f-b451-47f129706196-apiservice-cert\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.155544 master-0 kubenswrapper[13205]: I0319 09:44:10.153042 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ba0a560-3156-446f-b451-47f129706196-webhook-cert\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.155544 master-0 kubenswrapper[13205]: I0319 09:44:10.153085 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ba0a560-3156-446f-b451-47f129706196-openshift-service-ca\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.254701 master-0 kubenswrapper[13205]: I0319 09:44:10.254577 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d9tv\" (UniqueName: \"kubernetes.io/projected/1ba0a560-3156-446f-b451-47f129706196-kube-api-access-5d9tv\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.254701 master-0 kubenswrapper[13205]: I0319 09:44:10.254674 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ba0a560-3156-446f-b451-47f129706196-apiservice-cert\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " 
pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.254701 master-0 kubenswrapper[13205]: I0319 09:44:10.254707 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ba0a560-3156-446f-b451-47f129706196-webhook-cert\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.254958 master-0 kubenswrapper[13205]: I0319 09:44:10.254748 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ba0a560-3156-446f-b451-47f129706196-openshift-service-ca\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.255600 master-0 kubenswrapper[13205]: I0319 09:44:10.255570 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ba0a560-3156-446f-b451-47f129706196-openshift-service-ca\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.258816 master-0 kubenswrapper[13205]: I0319 09:44:10.258770 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ba0a560-3156-446f-b451-47f129706196-apiservice-cert\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.261558 master-0 kubenswrapper[13205]: I0319 09:44:10.261503 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ba0a560-3156-446f-b451-47f129706196-webhook-cert\") pod 
\"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.274493 master-0 kubenswrapper[13205]: I0319 09:44:10.274464 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d9tv\" (UniqueName: \"kubernetes.io/projected/1ba0a560-3156-446f-b451-47f129706196-kube-api-access-5d9tv\") pod \"perses-operator-d4f7cd8d5-2v5tt\" (UID: \"1ba0a560-3156-446f-b451-47f129706196\") " pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.314755 master-0 kubenswrapper[13205]: I0319 09:44:10.312684 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-cgw5q"] Mar 19 09:44:10.334277 master-0 kubenswrapper[13205]: I0319 09:44:10.334217 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:10.646924 master-0 kubenswrapper[13205]: I0319 09:44:10.645679 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-d4f7cd8d5-2v5tt"] Mar 19 09:44:10.727028 master-0 kubenswrapper[13205]: I0319 09:44:10.726971 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" event={"ID":"642e3b20-075e-4abe-9258-c47e385f1995","Type":"ContainerStarted","Data":"bdb91c89474645243fda1990ec9a45aca5e622dd51e6fed7f63ea2cc27e7548d"} Mar 19 09:44:10.728143 master-0 kubenswrapper[13205]: I0319 09:44:10.728109 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" event={"ID":"1966ff81-3727-4448-8c89-7c412a6d7df2","Type":"ContainerStarted","Data":"f5547f66cd904c6aae693e7eca4a0ba5373917acb319672e7284a34d96d73245"} Mar 19 09:44:10.729194 master-0 kubenswrapper[13205]: I0319 09:44:10.729149 13205 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" event={"ID":"1ba0a560-3156-446f-b451-47f129706196","Type":"ContainerStarted","Data":"996224bdb7685afbe35673ebf6dc66938e2fc8c43588001662191bff5881dca8"} Mar 19 09:44:10.730132 master-0 kubenswrapper[13205]: I0319 09:44:10.730106 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" event={"ID":"00c14551-3cf9-4127-9412-3e40820820ea","Type":"ContainerStarted","Data":"d7e19df0e33d857684acf706ff739cf3c644ee25bf744124ce92edd895f6d39a"} Mar 19 09:44:17.487551 master-0 kubenswrapper[13205]: I0319 09:44:17.482859 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-69bcd667c-x84zz" Mar 19 09:44:19.826719 master-0 kubenswrapper[13205]: I0319 09:44:19.826440 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" event={"ID":"1ba0a560-3156-446f-b451-47f129706196","Type":"ContainerStarted","Data":"60ece54b379f6e5eab2482edcbcd6b2be9aa8b10270c3b67d9ebf5471507b7e9"} Mar 19 09:44:19.826719 master-0 kubenswrapper[13205]: I0319 09:44:19.826634 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:19.828738 master-0 kubenswrapper[13205]: I0319 09:44:19.828480 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" event={"ID":"00c14551-3cf9-4127-9412-3e40820820ea","Type":"ContainerStarted","Data":"b268084e8194ff6fb9db2bf807b0d92220f5a855b14a6f52683020de44ac1a42"} Mar 19 09:44:19.830029 master-0 kubenswrapper[13205]: I0319 09:44:19.829993 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" 
event={"ID":"642e3b20-075e-4abe-9258-c47e385f1995","Type":"ContainerStarted","Data":"64d1bfd7c03a9249f630f935650f23ead87aa77ee3637cf01bff1a0806941fdd"} Mar 19 09:44:19.830270 master-0 kubenswrapper[13205]: I0319 09:44:19.830243 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:19.832312 master-0 kubenswrapper[13205]: I0319 09:44:19.832271 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" event={"ID":"1966ff81-3727-4448-8c89-7c412a6d7df2","Type":"ContainerStarted","Data":"20457026c9796b3f28da15d97709fa722d7c6d678670bae1c06c5cbad3cd4fb1"} Mar 19 09:44:19.847926 master-0 kubenswrapper[13205]: I0319 09:44:19.837058 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" event={"ID":"4598a0ea-67ba-4982-bc67-e77c90508261","Type":"ContainerStarted","Data":"ce8c1220c0f7ae3164b9efc5c93e755d6aee5159ce5c6fa1be6cd2e92c02174f"} Mar 19 09:44:19.847926 master-0 kubenswrapper[13205]: I0319 09:44:19.838653 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" Mar 19 09:44:19.851154 master-0 kubenswrapper[13205]: I0319 09:44:19.851085 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" podStartSLOduration=2.839694798 podStartE2EDuration="10.85106348s" podCreationTimestamp="2026-03-19 09:44:09 +0000 UTC" firstStartedPulling="2026-03-19 09:44:10.668665895 +0000 UTC m=+1236.000972783" lastFinishedPulling="2026-03-19 09:44:18.680034557 +0000 UTC m=+1244.012341465" observedRunningTime="2026-03-19 09:44:19.847724118 +0000 UTC m=+1245.180031036" watchObservedRunningTime="2026-03-19 09:44:19.85106348 +0000 UTC m=+1245.183370368" Mar 19 09:44:19.888992 master-0 
kubenswrapper[13205]: I0319 09:44:19.887676 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-cgw5q" podStartSLOduration=2.441486559 podStartE2EDuration="10.887656303s" podCreationTimestamp="2026-03-19 09:44:09 +0000 UTC" firstStartedPulling="2026-03-19 09:44:10.316407167 +0000 UTC m=+1235.648714055" lastFinishedPulling="2026-03-19 09:44:18.762576911 +0000 UTC m=+1244.094883799" observedRunningTime="2026-03-19 09:44:19.88631555 +0000 UTC m=+1245.218622488" watchObservedRunningTime="2026-03-19 09:44:19.887656303 +0000 UTC m=+1245.219963191" Mar 19 09:44:19.979548 master-0 kubenswrapper[13205]: I0319 09:44:19.979323 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-vgrm2" podStartSLOduration=2.222370379 podStartE2EDuration="10.979297709s" podCreationTimestamp="2026-03-19 09:44:09 +0000 UTC" firstStartedPulling="2026-03-19 09:44:09.983377448 +0000 UTC m=+1235.315684336" lastFinishedPulling="2026-03-19 09:44:18.740304778 +0000 UTC m=+1244.072611666" observedRunningTime="2026-03-19 09:44:19.953111561 +0000 UTC m=+1245.285418449" watchObservedRunningTime="2026-03-19 09:44:19.979297709 +0000 UTC m=+1245.311604597" Mar 19 09:44:20.087644 master-0 kubenswrapper[13205]: I0319 09:44:20.079567 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-64887bc684-nrzl5" podStartSLOduration=2.5450492970000003 podStartE2EDuration="11.079546677s" podCreationTimestamp="2026-03-19 09:44:09 +0000 UTC" firstStartedPulling="2026-03-19 09:44:10.146754946 +0000 UTC m=+1235.479061834" lastFinishedPulling="2026-03-19 09:44:18.681252316 +0000 UTC m=+1244.013559214" observedRunningTime="2026-03-19 09:44:20.024318298 +0000 UTC m=+1245.356625196" watchObservedRunningTime="2026-03-19 09:44:20.079546677 +0000 UTC m=+1245.411853575" 
Mar 19 09:44:20.116545 master-0 kubenswrapper[13205]: I0319 09:44:20.104639 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-cn2rm" podStartSLOduration=2.881430086 podStartE2EDuration="12.104617529s" podCreationTimestamp="2026-03-19 09:44:08 +0000 UTC" firstStartedPulling="2026-03-19 09:44:09.458630218 +0000 UTC m=+1234.790937106" lastFinishedPulling="2026-03-19 09:44:18.681817641 +0000 UTC m=+1244.014124549" observedRunningTime="2026-03-19 09:44:20.078420949 +0000 UTC m=+1245.410727837" watchObservedRunningTime="2026-03-19 09:44:20.104617529 +0000 UTC m=+1245.436924417" Mar 19 09:44:30.340045 master-0 kubenswrapper[13205]: I0319 09:44:30.339951 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-d4f7cd8d5-2v5tt" Mar 19 09:44:35.208629 master-0 kubenswrapper[13205]: E0319 09:44:35.208575 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:44:36.546338 master-0 kubenswrapper[13205]: I0319 09:44:36.546244 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-54b99f6f6b-hk7ql" Mar 19 09:44:45.370700 master-0 kubenswrapper[13205]: I0319 09:44:45.370640 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-88xgg"] Mar 19 09:44:45.373724 master-0 kubenswrapper[13205]: I0319 09:44:45.373694 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.375830 master-0 kubenswrapper[13205]: I0319 09:44:45.375790 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 09:44:45.376068 master-0 kubenswrapper[13205]: I0319 09:44:45.376046 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 09:44:45.381436 master-0 kubenswrapper[13205]: I0319 09:44:45.381375 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9"] Mar 19 09:44:45.384936 master-0 kubenswrapper[13205]: I0319 09:44:45.384908 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.387665 master-0 kubenswrapper[13205]: I0319 09:44:45.387647 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 09:44:45.402691 master-0 kubenswrapper[13205]: I0319 09:44:45.402617 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9"] Mar 19 09:44:45.530200 master-0 kubenswrapper[13205]: I0319 09:44:45.529362 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dvddw"] Mar 19 09:44:45.530582 master-0 kubenswrapper[13205]: I0319 09:44:45.530554 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.536077 master-0 kubenswrapper[13205]: I0319 09:44:45.534834 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 09:44:45.536077 master-0 kubenswrapper[13205]: I0319 09:44:45.534998 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 09:44:45.536077 master-0 kubenswrapper[13205]: I0319 09:44:45.535103 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.538887 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-metrics\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.538935 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d1114a57-d615-4763-928e-664cf7513b52-frr-startup\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.538954 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfdnx\" (UniqueName: \"kubernetes.io/projected/d081bb39-3cce-4a28-b53c-41414b48a4be-kube-api-access-jfdnx\") pod \"frr-k8s-webhook-server-bcc4b6f68-qvpr9\" (UID: \"d081bb39-3cce-4a28-b53c-41414b48a4be\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.538972 13205 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-frr-sockets\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.539003 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jmv\" (UniqueName: \"kubernetes.io/projected/d1114a57-d615-4763-928e-664cf7513b52-kube-api-access-j7jmv\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.539038 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1114a57-d615-4763-928e-664cf7513b52-metrics-certs\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.539063 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-frr-conf\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.539106 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d081bb39-3cce-4a28-b53c-41414b48a4be-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qvpr9\" (UID: \"d081bb39-3cce-4a28-b53c-41414b48a4be\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.540557 master-0 kubenswrapper[13205]: I0319 09:44:45.539131 13205 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-reloader\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.541728 master-0 kubenswrapper[13205]: I0319 09:44:45.541694 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-kssp8"] Mar 19 09:44:45.542931 master-0 kubenswrapper[13205]: I0319 09:44:45.542903 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kssp8" Mar 19 09:44:45.550461 master-0 kubenswrapper[13205]: I0319 09:44:45.548059 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 09:44:45.562197 master-0 kubenswrapper[13205]: I0319 09:44:45.561551 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kssp8"] Mar 19 09:44:45.640691 master-0 kubenswrapper[13205]: I0319 09:44:45.640554 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jmv\" (UniqueName: \"kubernetes.io/projected/d1114a57-d615-4763-928e-664cf7513b52-kube-api-access-j7jmv\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.640691 master-0 kubenswrapper[13205]: I0319 09:44:45.640607 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.640691 master-0 kubenswrapper[13205]: I0319 09:44:45.640656 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1114a57-d615-4763-928e-664cf7513b52-metrics-certs\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.640691 master-0 kubenswrapper[13205]: I0319 09:44:45.640684 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmkv6\" (UniqueName: \"kubernetes.io/projected/3391abc7-00e2-4a16-95b0-4961dabda05b-kube-api-access-lmkv6\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640705 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-frr-conf\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640729 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3391abc7-00e2-4a16-95b0-4961dabda05b-metallb-excludel2\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640755 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-metrics-certs\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640784 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d081bb39-3cce-4a28-b53c-41414b48a4be-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qvpr9\" (UID: \"d081bb39-3cce-4a28-b53c-41414b48a4be\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640811 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-reloader\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640837 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-metrics\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640860 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d1114a57-d615-4763-928e-664cf7513b52-frr-startup\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640876 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfdnx\" (UniqueName: \"kubernetes.io/projected/d081bb39-3cce-4a28-b53c-41414b48a4be-kube-api-access-jfdnx\") pod \"frr-k8s-webhook-server-bcc4b6f68-qvpr9\" (UID: \"d081bb39-3cce-4a28-b53c-41414b48a4be\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.640973 master-0 kubenswrapper[13205]: I0319 09:44:45.640890 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-frr-sockets\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.641291 master-0 kubenswrapper[13205]: I0319 09:44:45.641269 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-frr-sockets\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.641971 master-0 kubenswrapper[13205]: I0319 09:44:45.641940 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-reloader\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.642101 master-0 kubenswrapper[13205]: I0319 09:44:45.642063 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-frr-conf\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.642568 master-0 kubenswrapper[13205]: I0319 09:44:45.642547 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d1114a57-d615-4763-928e-664cf7513b52-metrics\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.642730 master-0 kubenswrapper[13205]: I0319 09:44:45.642693 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d1114a57-d615-4763-928e-664cf7513b52-frr-startup\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 
09:44:45.645260 master-0 kubenswrapper[13205]: I0319 09:44:45.645229 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1114a57-d615-4763-928e-664cf7513b52-metrics-certs\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.645535 master-0 kubenswrapper[13205]: I0319 09:44:45.645478 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d081bb39-3cce-4a28-b53c-41414b48a4be-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qvpr9\" (UID: \"d081bb39-3cce-4a28-b53c-41414b48a4be\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.656625 master-0 kubenswrapper[13205]: I0319 09:44:45.656550 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfdnx\" (UniqueName: \"kubernetes.io/projected/d081bb39-3cce-4a28-b53c-41414b48a4be-kube-api-access-jfdnx\") pod \"frr-k8s-webhook-server-bcc4b6f68-qvpr9\" (UID: \"d081bb39-3cce-4a28-b53c-41414b48a4be\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.661549 master-0 kubenswrapper[13205]: I0319 09:44:45.660474 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jmv\" (UniqueName: \"kubernetes.io/projected/d1114a57-d615-4763-928e-664cf7513b52-kube-api-access-j7jmv\") pod \"frr-k8s-88xgg\" (UID: \"d1114a57-d615-4763-928e-664cf7513b52\") " pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.687388 master-0 kubenswrapper[13205]: I0319 09:44:45.687337 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-88xgg" Mar 19 09:44:45.697904 master-0 kubenswrapper[13205]: I0319 09:44:45.697840 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:45.742694 master-0 kubenswrapper[13205]: I0319 09:44:45.742647 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-metrics-certs\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.742889 master-0 kubenswrapper[13205]: I0319 09:44:45.742724 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1df0e178-2040-45f9-8189-5c1d4bca71bd-metrics-certs\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8" Mar 19 09:44:45.742889 master-0 kubenswrapper[13205]: I0319 09:44:45.742815 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1df0e178-2040-45f9-8189-5c1d4bca71bd-cert\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8" Mar 19 09:44:45.743084 master-0 kubenswrapper[13205]: I0319 09:44:45.742851 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.743084 master-0 kubenswrapper[13205]: I0319 09:44:45.742940 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmkv6\" (UniqueName: \"kubernetes.io/projected/3391abc7-00e2-4a16-95b0-4961dabda05b-kube-api-access-lmkv6\") pod \"speaker-dvddw\" (UID: 
\"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.743084 master-0 kubenswrapper[13205]: I0319 09:44:45.742991 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8hkr\" (UniqueName: \"kubernetes.io/projected/1df0e178-2040-45f9-8189-5c1d4bca71bd-kube-api-access-s8hkr\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8" Mar 19 09:44:45.743084 master-0 kubenswrapper[13205]: I0319 09:44:45.743016 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3391abc7-00e2-4a16-95b0-4961dabda05b-metallb-excludel2\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.743759 master-0 kubenswrapper[13205]: E0319 09:44:45.743732 13205 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 09:44:45.743832 master-0 kubenswrapper[13205]: E0319 09:44:45.743819 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist podName:3391abc7-00e2-4a16-95b0-4961dabda05b nodeName:}" failed. No retries permitted until 2026-03-19 09:44:46.243801956 +0000 UTC m=+1271.576108844 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist") pod "speaker-dvddw" (UID: "3391abc7-00e2-4a16-95b0-4961dabda05b") : secret "metallb-memberlist" not found Mar 19 09:44:45.745178 master-0 kubenswrapper[13205]: I0319 09:44:45.744812 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3391abc7-00e2-4a16-95b0-4961dabda05b-metallb-excludel2\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.749036 master-0 kubenswrapper[13205]: I0319 09:44:45.748783 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-metrics-certs\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.780329 master-0 kubenswrapper[13205]: I0319 09:44:45.776565 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmkv6\" (UniqueName: \"kubernetes.io/projected/3391abc7-00e2-4a16-95b0-4961dabda05b-kube-api-access-lmkv6\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw" Mar 19 09:44:45.844708 master-0 kubenswrapper[13205]: I0319 09:44:45.844028 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1df0e178-2040-45f9-8189-5c1d4bca71bd-cert\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8" Mar 19 09:44:45.844708 master-0 kubenswrapper[13205]: I0319 09:44:45.844128 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8hkr\" (UniqueName: 
\"kubernetes.io/projected/1df0e178-2040-45f9-8189-5c1d4bca71bd-kube-api-access-s8hkr\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8"
Mar 19 09:44:45.844708 master-0 kubenswrapper[13205]: I0319 09:44:45.844174 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1df0e178-2040-45f9-8189-5c1d4bca71bd-metrics-certs\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8"
Mar 19 09:44:45.845883 master-0 kubenswrapper[13205]: I0319 09:44:45.845818 13205 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 19 09:44:45.848699 master-0 kubenswrapper[13205]: I0319 09:44:45.848650 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1df0e178-2040-45f9-8189-5c1d4bca71bd-metrics-certs\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8"
Mar 19 09:44:45.860840 master-0 kubenswrapper[13205]: I0319 09:44:45.860706 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1df0e178-2040-45f9-8189-5c1d4bca71bd-cert\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8"
Mar 19 09:44:45.866325 master-0 kubenswrapper[13205]: I0319 09:44:45.866287 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8hkr\" (UniqueName: \"kubernetes.io/projected/1df0e178-2040-45f9-8189-5c1d4bca71bd-kube-api-access-s8hkr\") pod \"controller-7bb4cc7c98-kssp8\" (UID: \"1df0e178-2040-45f9-8189-5c1d4bca71bd\") " pod="metallb-system/controller-7bb4cc7c98-kssp8"
Mar 19 09:44:45.874451 master-0 kubenswrapper[13205]: I0319 09:44:45.874403 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kssp8"
Mar 19 09:44:46.097467 master-0 kubenswrapper[13205]: I0319 09:44:46.097423 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9"]
Mar 19 09:44:46.098796 master-0 kubenswrapper[13205]: W0319 09:44:46.098758 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd081bb39_3cce_4a28_b53c_41414b48a4be.slice/crio-18ceecb79efed20ba6e6ca4d378708d4a4510b0a6abebbc0b6ba7daa1e2e3ea6 WatchSource:0}: Error finding container 18ceecb79efed20ba6e6ca4d378708d4a4510b0a6abebbc0b6ba7daa1e2e3ea6: Status 404 returned error can't find the container with id 18ceecb79efed20ba6e6ca4d378708d4a4510b0a6abebbc0b6ba7daa1e2e3ea6
Mar 19 09:44:46.122963 master-0 kubenswrapper[13205]: I0319 09:44:46.122895 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerStarted","Data":"36e7458e47ed2a4689593eccb9561529be6740a0a360b03315a1f3f881217f11"}
Mar 19 09:44:46.124200 master-0 kubenswrapper[13205]: I0319 09:44:46.124115 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" event={"ID":"d081bb39-3cce-4a28-b53c-41414b48a4be","Type":"ContainerStarted","Data":"18ceecb79efed20ba6e6ca4d378708d4a4510b0a6abebbc0b6ba7daa1e2e3ea6"}
Mar 19 09:44:46.253094 master-0 kubenswrapper[13205]: I0319 09:44:46.252912 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw"
Mar 19 09:44:46.253447 master-0 kubenswrapper[13205]: E0319 09:44:46.253242 13205 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 19 09:44:46.253447 master-0 kubenswrapper[13205]: E0319 09:44:46.253359 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist podName:3391abc7-00e2-4a16-95b0-4961dabda05b nodeName:}" failed. No retries permitted until 2026-03-19 09:44:47.253329533 +0000 UTC m=+1272.585636451 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist") pod "speaker-dvddw" (UID: "3391abc7-00e2-4a16-95b0-4961dabda05b") : secret "metallb-memberlist" not found
Mar 19 09:44:46.306129 master-0 kubenswrapper[13205]: I0319 09:44:46.306048 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kssp8"]
Mar 19 09:44:47.133455 master-0 kubenswrapper[13205]: I0319 09:44:47.133371 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kssp8" event={"ID":"1df0e178-2040-45f9-8189-5c1d4bca71bd","Type":"ContainerStarted","Data":"0dc147aa7c70809c4ff69b389733e755355e423a6d49e9b110db0c6db9daf9a8"}
Mar 19 09:44:47.133455 master-0 kubenswrapper[13205]: I0319 09:44:47.133454 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kssp8" event={"ID":"1df0e178-2040-45f9-8189-5c1d4bca71bd","Type":"ContainerStarted","Data":"95e29f9c349fc0e4f1786010894c69aa49446db62c945eacf7356387091d3f8f"}
Mar 19 09:44:47.268873 master-0 kubenswrapper[13205]: I0319 09:44:47.268793 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw"
Mar 19 09:44:47.271603 master-0 kubenswrapper[13205]: I0319 09:44:47.271570 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3391abc7-00e2-4a16-95b0-4961dabda05b-memberlist\") pod \"speaker-dvddw\" (UID: \"3391abc7-00e2-4a16-95b0-4961dabda05b\") " pod="metallb-system/speaker-dvddw"
Mar 19 09:44:47.324342 master-0 kubenswrapper[13205]: I0319 09:44:47.324291 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"]
Mar 19 09:44:47.328416 master-0 kubenswrapper[13205]: I0319 09:44:47.325758 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"
Mar 19 09:44:47.358868 master-0 kubenswrapper[13205]: I0319 09:44:47.358437 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dvddw"
Mar 19 09:44:47.377689 master-0 kubenswrapper[13205]: I0319 09:44:47.376034 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"]
Mar 19 09:44:47.388072 master-0 kubenswrapper[13205]: I0319 09:44:47.387279 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"]
Mar 19 09:44:47.392979 master-0 kubenswrapper[13205]: I0319 09:44:47.388296 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.412343 master-0 kubenswrapper[13205]: I0319 09:44:47.411690 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 19 09:44:47.460806 master-0 kubenswrapper[13205]: I0319 09:44:47.460749 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7bhhs"]
Mar 19 09:44:47.467464 master-0 kubenswrapper[13205]: I0319 09:44:47.467388 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.475691 master-0 kubenswrapper[13205]: I0319 09:44:47.475461 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"]
Mar 19 09:44:47.475691 master-0 kubenswrapper[13205]: I0319 09:44:47.475465 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2c2\" (UniqueName: \"kubernetes.io/projected/1d13de7b-9da4-4240-a24d-d85ff82b405e-kube-api-access-zn2c2\") pod \"nmstate-metrics-9b8c8685d-mxtkr\" (UID: \"1d13de7b-9da4-4240-a24d-d85ff82b405e\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"
Mar 19 09:44:47.475691 master-0 kubenswrapper[13205]: I0319 09:44:47.475680 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/21ed9745-5a64-43ba-94be-b27034f5de86-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-d44x4\" (UID: \"21ed9745-5a64-43ba-94be-b27034f5de86\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.475860 master-0 kubenswrapper[13205]: I0319 09:44:47.475718 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7g7q\" (UniqueName: \"kubernetes.io/projected/21ed9745-5a64-43ba-94be-b27034f5de86-kube-api-access-c7g7q\") pod \"nmstate-webhook-5f558f5558-d44x4\" (UID: \"21ed9745-5a64-43ba-94be-b27034f5de86\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.561896 master-0 kubenswrapper[13205]: I0319 09:44:47.561553 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"]
Mar 19 09:44:47.562995 master-0 kubenswrapper[13205]: I0319 09:44:47.562668 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.568096 master-0 kubenswrapper[13205]: I0319 09:44:47.566633 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 19 09:44:47.568096 master-0 kubenswrapper[13205]: I0319 09:44:47.566650 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 19 09:44:47.573593 master-0 kubenswrapper[13205]: I0319 09:44:47.572985 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"]
Mar 19 09:44:47.580473 master-0 kubenswrapper[13205]: I0319 09:44:47.578061 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-dbus-socket\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.580473 master-0 kubenswrapper[13205]: I0319 09:44:47.578248 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-nmstate-lock\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.580473 master-0 kubenswrapper[13205]: I0319 09:44:47.578351 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/21ed9745-5a64-43ba-94be-b27034f5de86-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-d44x4\" (UID: \"21ed9745-5a64-43ba-94be-b27034f5de86\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.580473 master-0 kubenswrapper[13205]: I0319 09:44:47.579698 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7g7q\" (UniqueName: \"kubernetes.io/projected/21ed9745-5a64-43ba-94be-b27034f5de86-kube-api-access-c7g7q\") pod \"nmstate-webhook-5f558f5558-d44x4\" (UID: \"21ed9745-5a64-43ba-94be-b27034f5de86\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.583376 master-0 kubenswrapper[13205]: I0319 09:44:47.582168 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-ovs-socket\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.583376 master-0 kubenswrapper[13205]: I0319 09:44:47.582314 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2c2\" (UniqueName: \"kubernetes.io/projected/1d13de7b-9da4-4240-a24d-d85ff82b405e-kube-api-access-zn2c2\") pod \"nmstate-metrics-9b8c8685d-mxtkr\" (UID: \"1d13de7b-9da4-4240-a24d-d85ff82b405e\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"
Mar 19 09:44:47.583376 master-0 kubenswrapper[13205]: I0319 09:44:47.582371 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9bh\" (UniqueName: \"kubernetes.io/projected/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-kube-api-access-hq9bh\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.584803 master-0 kubenswrapper[13205]: I0319 09:44:47.584765 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/21ed9745-5a64-43ba-94be-b27034f5de86-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-d44x4\" (UID: \"21ed9745-5a64-43ba-94be-b27034f5de86\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.606395 master-0 kubenswrapper[13205]: I0319 09:44:47.606261 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2c2\" (UniqueName: \"kubernetes.io/projected/1d13de7b-9da4-4240-a24d-d85ff82b405e-kube-api-access-zn2c2\") pod \"nmstate-metrics-9b8c8685d-mxtkr\" (UID: \"1d13de7b-9da4-4240-a24d-d85ff82b405e\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"
Mar 19 09:44:47.610311 master-0 kubenswrapper[13205]: I0319 09:44:47.607693 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7g7q\" (UniqueName: \"kubernetes.io/projected/21ed9745-5a64-43ba-94be-b27034f5de86-kube-api-access-c7g7q\") pod \"nmstate-webhook-5f558f5558-d44x4\" (UID: \"21ed9745-5a64-43ba-94be-b27034f5de86\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.685064 master-0 kubenswrapper[13205]: I0319 09:44:47.684944 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-dbus-socket\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.685064 master-0 kubenswrapper[13205]: I0319 09:44:47.685023 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-nmstate-lock\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.685276 master-0 kubenswrapper[13205]: I0319 09:44:47.685085 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-ovs-socket\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.685276 master-0 kubenswrapper[13205]: I0319 09:44:47.685119 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/23525aee-4327-4f3f-a471-501ab5740c98-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.685276 master-0 kubenswrapper[13205]: I0319 09:44:47.685139 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9bh\" (UniqueName: \"kubernetes.io/projected/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-kube-api-access-hq9bh\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.685276 master-0 kubenswrapper[13205]: I0319 09:44:47.685163 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/23525aee-4327-4f3f-a471-501ab5740c98-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.685276 master-0 kubenswrapper[13205]: I0319 09:44:47.685185 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9l8l\" (UniqueName: \"kubernetes.io/projected/23525aee-4327-4f3f-a471-501ab5740c98-kube-api-access-s9l8l\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.685605 master-0 kubenswrapper[13205]: I0319 09:44:47.685467 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-dbus-socket\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.685605 master-0 kubenswrapper[13205]: I0319 09:44:47.685512 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-nmstate-lock\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.685605 master-0 kubenswrapper[13205]: I0319 09:44:47.685550 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-ovs-socket\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.703699 master-0 kubenswrapper[13205]: I0319 09:44:47.703655 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9bh\" (UniqueName: \"kubernetes.io/projected/eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7-kube-api-access-hq9bh\") pod \"nmstate-handler-7bhhs\" (UID: \"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7\") " pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.716216 master-0 kubenswrapper[13205]: I0319 09:44:47.716175 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"
Mar 19 09:44:47.750799 master-0 kubenswrapper[13205]: I0319 09:44:47.750359 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"
Mar 19 09:44:47.779552 master-0 kubenswrapper[13205]: I0319 09:44:47.778262 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5ccdf4d79d-4zsfz"]
Mar 19 09:44:47.789039 master-0 kubenswrapper[13205]: I0319 09:44:47.779845 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.794648 master-0 kubenswrapper[13205]: I0319 09:44:47.790266 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/23525aee-4327-4f3f-a471-501ab5740c98-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.794648 master-0 kubenswrapper[13205]: I0319 09:44:47.790337 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/23525aee-4327-4f3f-a471-501ab5740c98-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.794648 master-0 kubenswrapper[13205]: I0319 09:44:47.790372 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9l8l\" (UniqueName: \"kubernetes.io/projected/23525aee-4327-4f3f-a471-501ab5740c98-kube-api-access-s9l8l\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.794648 master-0 kubenswrapper[13205]: E0319 09:44:47.790832 13205 secret.go:189] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 19 09:44:47.794648 master-0 kubenswrapper[13205]: E0319 09:44:47.790890 13205 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23525aee-4327-4f3f-a471-501ab5740c98-plugin-serving-cert podName:23525aee-4327-4f3f-a471-501ab5740c98 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:48.290871671 +0000 UTC m=+1273.623178559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/23525aee-4327-4f3f-a471-501ab5740c98-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-k4lpt" (UID: "23525aee-4327-4f3f-a471-501ab5740c98") : secret "plugin-serving-cert" not found
Mar 19 09:44:47.794648 master-0 kubenswrapper[13205]: I0319 09:44:47.792298 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/23525aee-4327-4f3f-a471-501ab5740c98-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.813360 master-0 kubenswrapper[13205]: I0319 09:44:47.812855 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5ccdf4d79d-4zsfz"]
Mar 19 09:44:47.830516 master-0 kubenswrapper[13205]: I0319 09:44:47.830474 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9l8l\" (UniqueName: \"kubernetes.io/projected/23525aee-4327-4f3f-a471-501ab5740c98-kube-api-access-s9l8l\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:47.883103 master-0 kubenswrapper[13205]: I0319 09:44:47.882497 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7bhhs"
Mar 19 09:44:47.895460 master-0 kubenswrapper[13205]: I0319 09:44:47.894092 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-serving-cert\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.895460 master-0 kubenswrapper[13205]: I0319 09:44:47.894163 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-service-ca\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.895460 master-0 kubenswrapper[13205]: I0319 09:44:47.894219 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-config\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.895460 master-0 kubenswrapper[13205]: I0319 09:44:47.894242 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-oauth-serving-cert\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.895460 master-0 kubenswrapper[13205]: I0319 09:44:47.894271 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s69tz\" (UniqueName: \"kubernetes.io/projected/dea63849-8bd2-479d-b5bc-a9eaadba35f3-kube-api-access-s69tz\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.895460 master-0 kubenswrapper[13205]: I0319 09:44:47.894322 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-oauth-config\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.895460 master-0 kubenswrapper[13205]: I0319 09:44:47.894354 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-trusted-ca-bundle\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.996346 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-serving-cert\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.996422 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-service-ca\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.996479 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-config\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.996517 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-oauth-serving-cert\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.996603 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s69tz\" (UniqueName: \"kubernetes.io/projected/dea63849-8bd2-479d-b5bc-a9eaadba35f3-kube-api-access-s69tz\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.996654 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-oauth-config\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.996682 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-trusted-ca-bundle\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:47.998590 master-0 kubenswrapper[13205]: I0319 09:44:47.997774 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-trusted-ca-bundle\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.002457 master-0 kubenswrapper[13205]: I0319 09:44:47.999504 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-config\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.002457 master-0 kubenswrapper[13205]: I0319 09:44:47.999943 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-service-ca\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.002457 master-0 kubenswrapper[13205]: I0319 09:44:48.000789 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dea63849-8bd2-479d-b5bc-a9eaadba35f3-oauth-serving-cert\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.005915 master-0 kubenswrapper[13205]: I0319 09:44:48.002910 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-oauth-config\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.005915 master-0 kubenswrapper[13205]: I0319 09:44:48.003020 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dea63849-8bd2-479d-b5bc-a9eaadba35f3-console-serving-cert\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.156770 master-0 kubenswrapper[13205]: I0319 09:44:48.156706 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dvddw" event={"ID":"3391abc7-00e2-4a16-95b0-4961dabda05b","Type":"ContainerStarted","Data":"cccd3c2966a3e9ec318622dae225e5b95d40c4002e263eca3a1b91dcec05cb40"}
Mar 19 09:44:48.156770 master-0 kubenswrapper[13205]: I0319 09:44:48.156765 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dvddw" event={"ID":"3391abc7-00e2-4a16-95b0-4961dabda05b","Type":"ContainerStarted","Data":"3b779baff2a6c762b594b192adf95e6bf362518442f1a91da1351194c5553473"}
Mar 19 09:44:48.160508 master-0 kubenswrapper[13205]: I0319 09:44:48.160462 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kssp8" event={"ID":"1df0e178-2040-45f9-8189-5c1d4bca71bd","Type":"ContainerStarted","Data":"01a09c836bb116b894fc34ff7bd1eccddcc03a31128683169a1f8bff572e7598"}
Mar 19 09:44:48.162361 master-0 kubenswrapper[13205]: I0319 09:44:48.162304 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7bhhs" event={"ID":"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7","Type":"ContainerStarted","Data":"8e36d865f368d47b524e533168f02b76c741eda0260e5381bad5a1df3da2199c"}
Mar 19 09:44:48.186275 master-0 kubenswrapper[13205]: I0319 09:44:48.186232 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69tz\" (UniqueName: \"kubernetes.io/projected/dea63849-8bd2-479d-b5bc-a9eaadba35f3-kube-api-access-s69tz\") pod \"console-5ccdf4d79d-4zsfz\" (UID: \"dea63849-8bd2-479d-b5bc-a9eaadba35f3\") " pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.301996 master-0 kubenswrapper[13205]: I0319 09:44:48.301943 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/23525aee-4327-4f3f-a471-501ab5740c98-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:48.311496 master-0 kubenswrapper[13205]: I0319 09:44:48.308116 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/23525aee-4327-4f3f-a471-501ab5740c98-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k4lpt\" (UID: \"23525aee-4327-4f3f-a471-501ab5740c98\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:48.437251 master-0 kubenswrapper[13205]: I0319 09:44:48.437186 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5ccdf4d79d-4zsfz"
Mar 19 09:44:48.519570 master-0 kubenswrapper[13205]: I0319 09:44:48.518349 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"
Mar 19 09:44:48.604340 master-0 kubenswrapper[13205]: I0319 09:44:48.603828 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr"]
Mar 19 09:44:48.613447 master-0 kubenswrapper[13205]: I0319 09:44:48.612618 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-d44x4"]
Mar 19 09:44:48.631124 master-0 kubenswrapper[13205]: W0319 09:44:48.631021 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ed9745_5a64_43ba_94be_b27034f5de86.slice/crio-e99ce983067918e3fc122d4bb26c046e7654bd76533eee1cdb78e52e651804b9 WatchSource:0}: Error finding container e99ce983067918e3fc122d4bb26c046e7654bd76533eee1cdb78e52e651804b9: Status 404 returned error can't find the container with id e99ce983067918e3fc122d4bb26c046e7654bd76533eee1cdb78e52e651804b9
Mar 19 09:44:48.960590 master-0 kubenswrapper[13205]: I0319 09:44:48.960196 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5ccdf4d79d-4zsfz"]
Mar 19 09:44:49.066422 master-0 kubenswrapper[13205]: W0319 09:44:49.065894 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23525aee_4327_4f3f_a471_501ab5740c98.slice/crio-7e3c43d196fb69af5c17e8950dc5523d1e344623ace971b3a2ef710dee872332 WatchSource:0}: Error finding container 7e3c43d196fb69af5c17e8950dc5523d1e344623ace971b3a2ef710dee872332: Status 404 returned error can't find the container with id 7e3c43d196fb69af5c17e8950dc5523d1e344623ace971b3a2ef710dee872332
Mar 19 09:44:49.070029 master-0 kubenswrapper[13205]: I0319 09:44:49.069956 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt"]
Mar 19 09:44:49.193197 master-0 kubenswrapper[13205]: I0319 09:44:49.192823 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4" event={"ID":"21ed9745-5a64-43ba-94be-b27034f5de86","Type":"ContainerStarted","Data":"e99ce983067918e3fc122d4bb26c046e7654bd76533eee1cdb78e52e651804b9"}
Mar 19 09:44:49.195246 master-0 kubenswrapper[13205]: I0319 09:44:49.195203 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dvddw" event={"ID":"3391abc7-00e2-4a16-95b0-4961dabda05b","Type":"ContainerStarted","Data":"e9ad183f3d1f87347b9f7b5c62d95ab4057a871056aaac8671f2377ec58cbce3"}
Mar 19 09:44:49.195845 master-0 kubenswrapper[13205]: I0319 09:44:49.195795 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dvddw"
Mar 19 09:44:49.198158 master-0 kubenswrapper[13205]: I0319 09:44:49.197853 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ccdf4d79d-4zsfz" event={"ID":"dea63849-8bd2-479d-b5bc-a9eaadba35f3","Type":"ContainerStarted","Data":"578f96288d7b0998a1c00da556c8d8d3d656c30559ca2f05ba0fb5a3fcdc8cb0"}
Mar 19 09:44:49.198158 master-0 kubenswrapper[13205]: I0319 09:44:49.197936 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ccdf4d79d-4zsfz" event={"ID":"dea63849-8bd2-479d-b5bc-a9eaadba35f3","Type":"ContainerStarted","Data":"e5337e9cba7826ec7812767f3f0ddc116a661e69792f5fbfc5c106ac684607b8"}
Mar 19 09:44:49.199977 master-0 kubenswrapper[13205]: I0319 09:44:49.199952 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr" event={"ID":"1d13de7b-9da4-4240-a24d-d85ff82b405e","Type":"ContainerStarted","Data":"1c2fdac148e09a6c0cd757353fb602afb9cf04d074c623e148a40820e2486116"}
Mar 19 09:44:49.201035 master-0 kubenswrapper[13205]: I0319 09:44:49.200991 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt" event={"ID":"23525aee-4327-4f3f-a471-501ab5740c98","Type":"ContainerStarted","Data":"7e3c43d196fb69af5c17e8950dc5523d1e344623ace971b3a2ef710dee872332"}
Mar 19 09:44:49.201120 master-0 kubenswrapper[13205]: I0319 09:44:49.201097 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-kssp8"
Mar 19 09:44:49.218656 master-0 kubenswrapper[13205]: I0319 09:44:49.218518 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dvddw" podStartSLOduration=4.218500086 podStartE2EDuration="4.218500086s" podCreationTimestamp="2026-03-19 09:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:44:49.210389989 +0000 UTC m=+1274.542696877" watchObservedRunningTime="2026-03-19 09:44:49.218500086 +0000 UTC m=+1274.550806974"
Mar 19 09:44:49.243930 master-0 kubenswrapper[13205]: I0319 09:44:49.243849 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-kssp8" podStartSLOduration=2.864256862 podStartE2EDuration="4.243831915s" podCreationTimestamp="2026-03-19 09:44:45 +0000 UTC" firstStartedPulling="2026-03-19 09:44:46.47366352 +0000 UTC m=+1271.805970408" lastFinishedPulling="2026-03-19 09:44:47.853238573 +0000 UTC m=+1273.185545461" observedRunningTime="2026-03-19 09:44:49.230385157 +0000 UTC m=+1274.562692045" watchObservedRunningTime="2026-03-19 09:44:49.243831915 +0000 UTC m=+1274.576138803"
Mar 19 09:44:49.257200 master-0 kubenswrapper[13205]: I0319 09:44:49.254318 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5ccdf4d79d-4zsfz" podStartSLOduration=2.254301681 podStartE2EDuration="2.254301681s" podCreationTimestamp="2026-03-19 09:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:44:49.250744213 +0000 UTC m=+1274.583051101" watchObservedRunningTime="2026-03-19 09:44:49.254301681 +0000 UTC m=+1274.586608569" Mar 19 09:44:51.967125 master-0 kubenswrapper[13205]: I0319 09:44:51.967061 13205 scope.go:117] "RemoveContainer" containerID="34566cfceb793a1b567a5c645aa383f0affc1644709bb43d497052e54db18d78" Mar 19 09:44:56.269349 master-0 kubenswrapper[13205]: I0319 09:44:56.269297 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" event={"ID":"d081bb39-3cce-4a28-b53c-41414b48a4be","Type":"ContainerStarted","Data":"682eac50f5dc302a2be8837481830dc296d886d85b53c57738575170580b8228"} Mar 19 09:44:56.270063 master-0 kubenswrapper[13205]: I0319 09:44:56.269401 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:44:56.272193 master-0 kubenswrapper[13205]: I0319 09:44:56.272151 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr" event={"ID":"1d13de7b-9da4-4240-a24d-d85ff82b405e","Type":"ContainerStarted","Data":"5a1d5718bc433d0ed7e44ab8e80a2bface8f335f09c93b9c5a15d54c394062c7"} Mar 19 09:44:56.280586 master-0 kubenswrapper[13205]: I0319 09:44:56.280167 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt" event={"ID":"23525aee-4327-4f3f-a471-501ab5740c98","Type":"ContainerStarted","Data":"cfa5abfa2e81ec52d12a4b98dc0ae6234407050ce9e918e0256ac8b010a5148a"} Mar 19 09:44:56.288023 master-0 kubenswrapper[13205]: I0319 09:44:56.286472 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4" event={"ID":"21ed9745-5a64-43ba-94be-b27034f5de86","Type":"ContainerStarted","Data":"b8fba574f8fc8c423b6ad69d18c4f75e48b6bff11e827b5cd9b102c8592800a9"} Mar 19 
09:44:56.288023 master-0 kubenswrapper[13205]: I0319 09:44:56.287302 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4" Mar 19 09:44:56.297078 master-0 kubenswrapper[13205]: I0319 09:44:56.295173 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" podStartSLOduration=1.444011137 podStartE2EDuration="11.295140585s" podCreationTimestamp="2026-03-19 09:44:45 +0000 UTC" firstStartedPulling="2026-03-19 09:44:46.10165141 +0000 UTC m=+1271.433958308" lastFinishedPulling="2026-03-19 09:44:55.952780858 +0000 UTC m=+1281.285087756" observedRunningTime="2026-03-19 09:44:56.291400734 +0000 UTC m=+1281.623707702" watchObservedRunningTime="2026-03-19 09:44:56.295140585 +0000 UTC m=+1281.627447473" Mar 19 09:44:56.331588 master-0 kubenswrapper[13205]: I0319 09:44:56.331508 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4" podStartSLOduration=2.024696637 podStartE2EDuration="9.331489293s" podCreationTimestamp="2026-03-19 09:44:47 +0000 UTC" firstStartedPulling="2026-03-19 09:44:48.667980429 +0000 UTC m=+1274.000287317" lastFinishedPulling="2026-03-19 09:44:55.974773065 +0000 UTC m=+1281.307079973" observedRunningTime="2026-03-19 09:44:56.324289626 +0000 UTC m=+1281.656596514" watchObservedRunningTime="2026-03-19 09:44:56.331489293 +0000 UTC m=+1281.663796181" Mar 19 09:44:56.344047 master-0 kubenswrapper[13205]: I0319 09:44:56.343965 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k4lpt" podStartSLOduration=2.455891571 podStartE2EDuration="9.343947436s" podCreationTimestamp="2026-03-19 09:44:47 +0000 UTC" firstStartedPulling="2026-03-19 09:44:49.068000763 +0000 UTC m=+1274.400307651" lastFinishedPulling="2026-03-19 09:44:55.956056608 +0000 UTC m=+1281.288363516" 
observedRunningTime="2026-03-19 09:44:56.341773613 +0000 UTC m=+1281.674080511" watchObservedRunningTime="2026-03-19 09:44:56.343947436 +0000 UTC m=+1281.676254324" Mar 19 09:44:57.301473 master-0 kubenswrapper[13205]: I0319 09:44:57.301380 13205 generic.go:334] "Generic (PLEG): container finished" podID="d1114a57-d615-4763-928e-664cf7513b52" containerID="aeb0f60343128a0394f9ee3778ac31d58096dc0d8cef84b8d3035ec9c1415ea7" exitCode=0 Mar 19 09:44:57.301473 master-0 kubenswrapper[13205]: I0319 09:44:57.301462 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerDied","Data":"aeb0f60343128a0394f9ee3778ac31d58096dc0d8cef84b8d3035ec9c1415ea7"} Mar 19 09:44:57.305404 master-0 kubenswrapper[13205]: I0319 09:44:57.305332 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7bhhs" event={"ID":"eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7","Type":"ContainerStarted","Data":"bcc928a9d5b8d697c42a8147bbb504b1b1db12a1ac1edd46d214ea7701d1d172"} Mar 19 09:44:57.305724 master-0 kubenswrapper[13205]: I0319 09:44:57.305675 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7bhhs" Mar 19 09:44:57.313164 master-0 kubenswrapper[13205]: I0319 09:44:57.313084 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr" event={"ID":"1d13de7b-9da4-4240-a24d-d85ff82b405e","Type":"ContainerStarted","Data":"bc498e1efad55c406137322dd5ca938f58984101ed381d588045d52d0c13c931"} Mar 19 09:44:57.361327 master-0 kubenswrapper[13205]: I0319 09:44:57.359910 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-mxtkr" podStartSLOduration=3.07689912 podStartE2EDuration="10.359891384s" podCreationTimestamp="2026-03-19 09:44:47 +0000 UTC" firstStartedPulling="2026-03-19 09:44:48.668600355 +0000 UTC 
m=+1274.000907243" lastFinishedPulling="2026-03-19 09:44:55.951592609 +0000 UTC m=+1281.283899507" observedRunningTime="2026-03-19 09:44:57.357612018 +0000 UTC m=+1282.689918966" watchObservedRunningTime="2026-03-19 09:44:57.359891384 +0000 UTC m=+1282.692198292" Mar 19 09:44:57.363897 master-0 kubenswrapper[13205]: I0319 09:44:57.363836 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dvddw" Mar 19 09:44:57.427700 master-0 kubenswrapper[13205]: I0319 09:44:57.427398 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7bhhs" podStartSLOduration=2.4135016670000002 podStartE2EDuration="10.427377401s" podCreationTimestamp="2026-03-19 09:44:47 +0000 UTC" firstStartedPulling="2026-03-19 09:44:47.939213142 +0000 UTC m=+1273.271520040" lastFinishedPulling="2026-03-19 09:44:55.953088866 +0000 UTC m=+1281.285395774" observedRunningTime="2026-03-19 09:44:57.415689146 +0000 UTC m=+1282.747996034" watchObservedRunningTime="2026-03-19 09:44:57.427377401 +0000 UTC m=+1282.759684289" Mar 19 09:44:58.330838 master-0 kubenswrapper[13205]: I0319 09:44:58.330758 13205 generic.go:334] "Generic (PLEG): container finished" podID="d1114a57-d615-4763-928e-664cf7513b52" containerID="1ef4d43d087f20713215634b2bc9a8f95c4395c2b48d1ac2c0d81ded913e9e04" exitCode=0 Mar 19 09:44:58.331550 master-0 kubenswrapper[13205]: I0319 09:44:58.330986 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerDied","Data":"1ef4d43d087f20713215634b2bc9a8f95c4395c2b48d1ac2c0d81ded913e9e04"} Mar 19 09:44:58.437734 master-0 kubenswrapper[13205]: I0319 09:44:58.437682 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5ccdf4d79d-4zsfz" Mar 19 09:44:58.437734 master-0 kubenswrapper[13205]: I0319 09:44:58.437731 13205 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-console/console-5ccdf4d79d-4zsfz" Mar 19 09:44:58.441412 master-0 kubenswrapper[13205]: I0319 09:44:58.441368 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5ccdf4d79d-4zsfz" Mar 19 09:44:59.346295 master-0 kubenswrapper[13205]: I0319 09:44:59.346239 13205 generic.go:334] "Generic (PLEG): container finished" podID="d1114a57-d615-4763-928e-664cf7513b52" containerID="f00fca09884ff1d56b1ef49ab869143351349f69206ba130e2ef9fa89bede5cb" exitCode=0 Mar 19 09:44:59.347059 master-0 kubenswrapper[13205]: I0319 09:44:59.347025 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerDied","Data":"f00fca09884ff1d56b1ef49ab869143351349f69206ba130e2ef9fa89bede5cb"} Mar 19 09:44:59.352326 master-0 kubenswrapper[13205]: I0319 09:44:59.352277 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5ccdf4d79d-4zsfz" Mar 19 09:44:59.494733 master-0 kubenswrapper[13205]: I0319 09:44:59.494687 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b9f6c6556-vscqq"] Mar 19 09:45:00.363806 master-0 kubenswrapper[13205]: I0319 09:45:00.362070 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerStarted","Data":"cd80101ca5ea945803798ad771ac75dfd767eefda44f125ad1342abd42f407c8"} Mar 19 09:45:00.363806 master-0 kubenswrapper[13205]: I0319 09:45:00.362119 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerStarted","Data":"006f0a9187dcfd47b1d74f7d8473ab7796619a6e26a56ff74a2f1eef56b5fb1f"} Mar 19 09:45:00.363806 master-0 kubenswrapper[13205]: I0319 09:45:00.362132 13205 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerStarted","Data":"ec05e7ea37baeebf2faddea8bf14daac43f578ce5064a799830403894032750e"} Mar 19 09:45:00.363806 master-0 kubenswrapper[13205]: I0319 09:45:00.362146 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerStarted","Data":"0ac5a9385f81e1fee3323922e828338c30935ebd50834b115a3e6a86f30149af"} Mar 19 09:45:00.363806 master-0 kubenswrapper[13205]: I0319 09:45:00.362157 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerStarted","Data":"3f53db95b819bff48ccf70a4e2643a7d91791ba20a3633ee46d9b19ea773176a"} Mar 19 09:45:01.377769 master-0 kubenswrapper[13205]: I0319 09:45:01.377716 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-88xgg" event={"ID":"d1114a57-d615-4763-928e-664cf7513b52","Type":"ContainerStarted","Data":"b9bd935a9e7f85d311afcd2daa70cd757ff604df297eac3e85c97db5c448c0e7"} Mar 19 09:45:01.378384 master-0 kubenswrapper[13205]: I0319 09:45:01.378065 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-88xgg" Mar 19 09:45:02.927577 master-0 kubenswrapper[13205]: I0319 09:45:02.927336 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7bhhs" Mar 19 09:45:02.970579 master-0 kubenswrapper[13205]: I0319 09:45:02.969653 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-88xgg" podStartSLOduration=7.796591622 podStartE2EDuration="17.969624807s" podCreationTimestamp="2026-03-19 09:44:45 +0000 UTC" firstStartedPulling="2026-03-19 09:44:45.864302527 +0000 UTC m=+1271.196609415" lastFinishedPulling="2026-03-19 09:44:56.037335712 +0000 UTC 
m=+1281.369642600" observedRunningTime="2026-03-19 09:45:01.404489055 +0000 UTC m=+1286.736795953" watchObservedRunningTime="2026-03-19 09:45:02.969624807 +0000 UTC m=+1288.301931735" Mar 19 09:45:05.687757 master-0 kubenswrapper[13205]: I0319 09:45:05.687665 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-88xgg" Mar 19 09:45:05.750245 master-0 kubenswrapper[13205]: I0319 09:45:05.750145 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-88xgg" Mar 19 09:45:05.881407 master-0 kubenswrapper[13205]: I0319 09:45:05.881308 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-kssp8" Mar 19 09:45:07.760944 master-0 kubenswrapper[13205]: I0319 09:45:07.760884 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d44x4" Mar 19 09:45:12.571212 master-0 kubenswrapper[13205]: I0319 09:45:12.571164 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-pclnw"] Mar 19 09:45:12.572915 master-0 kubenswrapper[13205]: I0319 09:45:12.572897 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.576750 master-0 kubenswrapper[13205]: I0319 09:45:12.576408 13205 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Mar 19 09:45:12.597451 master-0 kubenswrapper[13205]: I0319 09:45:12.597022 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-pclnw"] Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.722778 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-sys\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.722837 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-pod-volumes-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.722863 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-csi-plugin-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.722893 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-lvmd-config\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " 
pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.722920 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-file-lock-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.722957 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-registration-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.722988 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-device-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.723010 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-node-plugin-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.723079 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-run-udev\") pod \"vg-manager-pclnw\" (UID: 
\"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.723104 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2jzl\" (UniqueName: \"kubernetes.io/projected/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-kube-api-access-r2jzl\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.723273 master-0 kubenswrapper[13205]: I0319 09:45:12.723127 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-metrics-cert\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824419 master-0 kubenswrapper[13205]: I0319 09:45:12.824282 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-pod-volumes-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824419 master-0 kubenswrapper[13205]: I0319 09:45:12.824334 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-csi-plugin-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824419 master-0 kubenswrapper[13205]: I0319 09:45:12.824363 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-lvmd-config\") pod \"vg-manager-pclnw\" (UID: 
\"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824743 master-0 kubenswrapper[13205]: I0319 09:45:12.824517 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-file-lock-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824743 master-0 kubenswrapper[13205]: I0319 09:45:12.824603 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-lvmd-config\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824743 master-0 kubenswrapper[13205]: I0319 09:45:12.824665 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-registration-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824743 master-0 kubenswrapper[13205]: I0319 09:45:12.824716 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-device-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824907 master-0 kubenswrapper[13205]: I0319 09:45:12.824733 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-csi-plugin-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 
09:45:12.824907 master-0 kubenswrapper[13205]: I0319 09:45:12.824757 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-node-plugin-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824907 master-0 kubenswrapper[13205]: I0319 09:45:12.824774 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-device-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824907 master-0 kubenswrapper[13205]: I0319 09:45:12.824759 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-registration-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.824907 master-0 kubenswrapper[13205]: I0319 09:45:12.824879 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-file-lock-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825122 master-0 kubenswrapper[13205]: I0319 09:45:12.824929 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-node-plugin-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825122 master-0 kubenswrapper[13205]: I0319 09:45:12.824955 13205 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-run-udev\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825122 master-0 kubenswrapper[13205]: I0319 09:45:12.824987 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2jzl\" (UniqueName: \"kubernetes.io/projected/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-kube-api-access-r2jzl\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825122 master-0 kubenswrapper[13205]: I0319 09:45:12.824998 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-pod-volumes-dir\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825122 master-0 kubenswrapper[13205]: I0319 09:45:12.825011 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-metrics-cert\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825122 master-0 kubenswrapper[13205]: I0319 09:45:12.825028 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-run-udev\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825122 master-0 kubenswrapper[13205]: I0319 09:45:12.825109 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-sys\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.825483 master-0 kubenswrapper[13205]: I0319 09:45:12.825244 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-sys\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.827952 master-0 kubenswrapper[13205]: I0319 09:45:12.827926 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-metrics-cert\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.849100 master-0 kubenswrapper[13205]: I0319 09:45:12.849032 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2jzl\" (UniqueName: \"kubernetes.io/projected/ab2297f7-d3ff-4033-b3c1-fa30756a6e9a-kube-api-access-r2jzl\") pod \"vg-manager-pclnw\" (UID: \"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a\") " pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:12.905264 master-0 kubenswrapper[13205]: I0319 09:45:12.905190 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:13.358296 master-0 kubenswrapper[13205]: I0319 09:45:13.358209 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-pclnw"] Mar 19 09:45:13.363072 master-0 kubenswrapper[13205]: W0319 09:45:13.363009 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2297f7_d3ff_4033_b3c1_fa30756a6e9a.slice/crio-5c7bf177c124076916fe14664a21d5f97cbbe3c566dee2ec4fc4044d7082311d WatchSource:0}: Error finding container 5c7bf177c124076916fe14664a21d5f97cbbe3c566dee2ec4fc4044d7082311d: Status 404 returned error can't find the container with id 5c7bf177c124076916fe14664a21d5f97cbbe3c566dee2ec4fc4044d7082311d Mar 19 09:45:13.514346 master-0 kubenswrapper[13205]: I0319 09:45:13.513635 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-pclnw" event={"ID":"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a","Type":"ContainerStarted","Data":"5c7bf177c124076916fe14664a21d5f97cbbe3c566dee2ec4fc4044d7082311d"} Mar 19 09:45:13.545260 master-0 kubenswrapper[13205]: I0319 09:45:13.545150 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-pclnw" podStartSLOduration=1.5451268150000002 podStartE2EDuration="1.545126815s" podCreationTimestamp="2026-03-19 09:45:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:45:13.535671934 +0000 UTC m=+1298.867978832" watchObservedRunningTime="2026-03-19 09:45:13.545126815 +0000 UTC m=+1298.877433703" Mar 19 09:45:14.554557 master-0 kubenswrapper[13205]: I0319 09:45:14.548841 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-pclnw" 
event={"ID":"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a","Type":"ContainerStarted","Data":"330524fd183e4597277ee510ec67591bc599ee2891ebc2e878d816673966f0e4"} Mar 19 09:45:15.564746 master-0 kubenswrapper[13205]: I0319 09:45:15.563961 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-pclnw_ab2297f7-d3ff-4033-b3c1-fa30756a6e9a/vg-manager/0.log" Mar 19 09:45:15.564746 master-0 kubenswrapper[13205]: I0319 09:45:15.564011 13205 generic.go:334] "Generic (PLEG): container finished" podID="ab2297f7-d3ff-4033-b3c1-fa30756a6e9a" containerID="330524fd183e4597277ee510ec67591bc599ee2891ebc2e878d816673966f0e4" exitCode=1 Mar 19 09:45:15.564746 master-0 kubenswrapper[13205]: I0319 09:45:15.564677 13205 scope.go:117] "RemoveContainer" containerID="330524fd183e4597277ee510ec67591bc599ee2891ebc2e878d816673966f0e4" Mar 19 09:45:15.565455 master-0 kubenswrapper[13205]: I0319 09:45:15.564039 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-pclnw" event={"ID":"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a","Type":"ContainerDied","Data":"330524fd183e4597277ee510ec67591bc599ee2891ebc2e878d816673966f0e4"} Mar 19 09:45:15.689657 master-0 kubenswrapper[13205]: I0319 09:45:15.689594 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-88xgg" Mar 19 09:45:15.704701 master-0 kubenswrapper[13205]: I0319 09:45:15.704645 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qvpr9" Mar 19 09:45:15.981619 master-0 kubenswrapper[13205]: I0319 09:45:15.981100 13205 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 19 09:45:16.206614 master-0 kubenswrapper[13205]: I0319 09:45:16.206479 13205 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-19T09:45:15.981131634Z","Handler":null,"Name":""} Mar 19 09:45:16.209075 master-0 kubenswrapper[13205]: I0319 09:45:16.209045 13205 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 19 09:45:16.209153 master-0 kubenswrapper[13205]: I0319 09:45:16.209088 13205 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 19 09:45:16.588599 master-0 kubenswrapper[13205]: I0319 09:45:16.584853 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-pclnw_ab2297f7-d3ff-4033-b3c1-fa30756a6e9a/vg-manager/0.log" Mar 19 09:45:16.588599 master-0 kubenswrapper[13205]: I0319 09:45:16.584915 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-pclnw" event={"ID":"ab2297f7-d3ff-4033-b3c1-fa30756a6e9a","Type":"ContainerStarted","Data":"11e75a99248715735fb473b085f6b7b756a72e05a9b14457f632476e44ecd7f5"} Mar 19 09:45:19.357156 master-0 kubenswrapper[13205]: I0319 09:45:19.357096 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6hs5j"] Mar 19 09:45:19.358300 master-0 kubenswrapper[13205]: I0319 09:45:19.358272 13205 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:19.366768 master-0 kubenswrapper[13205]: I0319 09:45:19.366695 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 09:45:19.366993 master-0 kubenswrapper[13205]: I0319 09:45:19.366945 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 09:45:19.386617 master-0 kubenswrapper[13205]: I0319 09:45:19.386379 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6hs5j"] Mar 19 09:45:19.469106 master-0 kubenswrapper[13205]: I0319 09:45:19.469051 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv6mg\" (UniqueName: \"kubernetes.io/projected/e9e5abec-3ac3-4b91-b51d-74efffaff8f8-kube-api-access-gv6mg\") pod \"openstack-operator-index-6hs5j\" (UID: \"e9e5abec-3ac3-4b91-b51d-74efffaff8f8\") " pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:19.571111 master-0 kubenswrapper[13205]: I0319 09:45:19.571018 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv6mg\" (UniqueName: \"kubernetes.io/projected/e9e5abec-3ac3-4b91-b51d-74efffaff8f8-kube-api-access-gv6mg\") pod \"openstack-operator-index-6hs5j\" (UID: \"e9e5abec-3ac3-4b91-b51d-74efffaff8f8\") " pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:19.597247 master-0 kubenswrapper[13205]: I0319 09:45:19.597214 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv6mg\" (UniqueName: \"kubernetes.io/projected/e9e5abec-3ac3-4b91-b51d-74efffaff8f8-kube-api-access-gv6mg\") pod \"openstack-operator-index-6hs5j\" (UID: \"e9e5abec-3ac3-4b91-b51d-74efffaff8f8\") " pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:19.681732 master-0 
kubenswrapper[13205]: I0319 09:45:19.681116 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:20.201750 master-0 kubenswrapper[13205]: I0319 09:45:20.201673 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6hs5j"] Mar 19 09:45:20.208823 master-0 kubenswrapper[13205]: W0319 09:45:20.208756 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e5abec_3ac3_4b91_b51d_74efffaff8f8.slice/crio-3569a09b115dc1b9e0b64917b4e42ae3a3311cffd8074fbbcc4d7aa79614fab6 WatchSource:0}: Error finding container 3569a09b115dc1b9e0b64917b4e42ae3a3311cffd8074fbbcc4d7aa79614fab6: Status 404 returned error can't find the container with id 3569a09b115dc1b9e0b64917b4e42ae3a3311cffd8074fbbcc4d7aa79614fab6 Mar 19 09:45:20.628852 master-0 kubenswrapper[13205]: I0319 09:45:20.628766 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6hs5j" event={"ID":"e9e5abec-3ac3-4b91-b51d-74efffaff8f8","Type":"ContainerStarted","Data":"3569a09b115dc1b9e0b64917b4e42ae3a3311cffd8074fbbcc4d7aa79614fab6"} Mar 19 09:45:22.657711 master-0 kubenswrapper[13205]: I0319 09:45:22.657179 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6hs5j" event={"ID":"e9e5abec-3ac3-4b91-b51d-74efffaff8f8","Type":"ContainerStarted","Data":"c56fb13a55de2bdd3a7b612d2ff3b410d36cd538467fe55b5e895249d3fba044"} Mar 19 09:45:22.692434 master-0 kubenswrapper[13205]: I0319 09:45:22.692327 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6hs5j" podStartSLOduration=2.056384141 podStartE2EDuration="3.692307691s" podCreationTimestamp="2026-03-19 09:45:19 +0000 UTC" firstStartedPulling="2026-03-19 09:45:20.211494389 +0000 UTC 
m=+1305.543801277" lastFinishedPulling="2026-03-19 09:45:21.847417939 +0000 UTC m=+1307.179724827" observedRunningTime="2026-03-19 09:45:22.687596036 +0000 UTC m=+1308.019902934" watchObservedRunningTime="2026-03-19 09:45:22.692307691 +0000 UTC m=+1308.024614579" Mar 19 09:45:22.906047 master-0 kubenswrapper[13205]: I0319 09:45:22.905964 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:22.908501 master-0 kubenswrapper[13205]: I0319 09:45:22.908391 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:23.665110 master-0 kubenswrapper[13205]: I0319 09:45:23.665059 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:23.666051 master-0 kubenswrapper[13205]: I0319 09:45:23.665976 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-pclnw" Mar 19 09:45:24.577668 master-0 kubenswrapper[13205]: I0319 09:45:24.577479 13205 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-b9f6c6556-vscqq" podUID="78e6748d-be2a-4245-8a1e-567e1d7aa398" containerName="console" containerID="cri-o://66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24" gracePeriod=15 Mar 19 09:45:25.076188 master-0 kubenswrapper[13205]: I0319 09:45:25.076150 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b9f6c6556-vscqq_78e6748d-be2a-4245-8a1e-567e1d7aa398/console/0.log" Mar 19 09:45:25.077105 master-0 kubenswrapper[13205]: I0319 09:45:25.076233 13205 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:45:25.193235 master-0 kubenswrapper[13205]: I0319 09:45:25.192825 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-oauth-config\") pod \"78e6748d-be2a-4245-8a1e-567e1d7aa398\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " Mar 19 09:45:25.193453 master-0 kubenswrapper[13205]: I0319 09:45:25.193267 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-trusted-ca-bundle\") pod \"78e6748d-be2a-4245-8a1e-567e1d7aa398\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " Mar 19 09:45:25.193453 master-0 kubenswrapper[13205]: I0319 09:45:25.193309 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-oauth-serving-cert\") pod \"78e6748d-be2a-4245-8a1e-567e1d7aa398\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " Mar 19 09:45:25.193453 master-0 kubenswrapper[13205]: I0319 09:45:25.193331 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-config\") pod \"78e6748d-be2a-4245-8a1e-567e1d7aa398\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " Mar 19 09:45:25.193453 master-0 kubenswrapper[13205]: I0319 09:45:25.193351 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-serving-cert\") pod \"78e6748d-be2a-4245-8a1e-567e1d7aa398\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " Mar 19 09:45:25.193453 master-0 kubenswrapper[13205]: 
I0319 09:45:25.193385 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-service-ca\") pod \"78e6748d-be2a-4245-8a1e-567e1d7aa398\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " Mar 19 09:45:25.193643 master-0 kubenswrapper[13205]: I0319 09:45:25.193477 13205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b77rk\" (UniqueName: \"kubernetes.io/projected/78e6748d-be2a-4245-8a1e-567e1d7aa398-kube-api-access-b77rk\") pod \"78e6748d-be2a-4245-8a1e-567e1d7aa398\" (UID: \"78e6748d-be2a-4245-8a1e-567e1d7aa398\") " Mar 19 09:45:25.194500 master-0 kubenswrapper[13205]: I0319 09:45:25.194459 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "78e6748d-be2a-4245-8a1e-567e1d7aa398" (UID: "78e6748d-be2a-4245-8a1e-567e1d7aa398"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:25.194886 master-0 kubenswrapper[13205]: I0319 09:45:25.194506 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-config" (OuterVolumeSpecName: "console-config") pod "78e6748d-be2a-4245-8a1e-567e1d7aa398" (UID: "78e6748d-be2a-4245-8a1e-567e1d7aa398"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:25.194886 master-0 kubenswrapper[13205]: I0319 09:45:25.194668 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-service-ca" (OuterVolumeSpecName: "service-ca") pod "78e6748d-be2a-4245-8a1e-567e1d7aa398" (UID: "78e6748d-be2a-4245-8a1e-567e1d7aa398"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:25.194886 master-0 kubenswrapper[13205]: I0319 09:45:25.194781 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "78e6748d-be2a-4245-8a1e-567e1d7aa398" (UID: "78e6748d-be2a-4245-8a1e-567e1d7aa398"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:25.197176 master-0 kubenswrapper[13205]: I0319 09:45:25.197109 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "78e6748d-be2a-4245-8a1e-567e1d7aa398" (UID: "78e6748d-be2a-4245-8a1e-567e1d7aa398"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:45:25.197260 master-0 kubenswrapper[13205]: I0319 09:45:25.197136 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "78e6748d-be2a-4245-8a1e-567e1d7aa398" (UID: "78e6748d-be2a-4245-8a1e-567e1d7aa398"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:45:25.198568 master-0 kubenswrapper[13205]: I0319 09:45:25.198521 13205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78e6748d-be2a-4245-8a1e-567e1d7aa398-kube-api-access-b77rk" (OuterVolumeSpecName: "kube-api-access-b77rk") pod "78e6748d-be2a-4245-8a1e-567e1d7aa398" (UID: "78e6748d-be2a-4245-8a1e-567e1d7aa398"). InnerVolumeSpecName "kube-api-access-b77rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:45:25.295894 master-0 kubenswrapper[13205]: I0319 09:45:25.295720 13205 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b77rk\" (UniqueName: \"kubernetes.io/projected/78e6748d-be2a-4245-8a1e-567e1d7aa398-kube-api-access-b77rk\") on node \"master-0\" DevicePath \"\"" Mar 19 09:45:25.295894 master-0 kubenswrapper[13205]: I0319 09:45:25.295762 13205 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:45:25.295894 master-0 kubenswrapper[13205]: I0319 09:45:25.295777 13205 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:45:25.295894 master-0 kubenswrapper[13205]: I0319 09:45:25.295791 13205 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:45:25.295894 master-0 kubenswrapper[13205]: I0319 09:45:25.295805 13205 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:45:25.295894 master-0 kubenswrapper[13205]: I0319 09:45:25.295816 13205 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78e6748d-be2a-4245-8a1e-567e1d7aa398-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:45:25.295894 master-0 kubenswrapper[13205]: I0319 09:45:25.295830 13205 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/78e6748d-be2a-4245-8a1e-567e1d7aa398-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:45:25.686478 master-0 kubenswrapper[13205]: I0319 09:45:25.686289 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b9f6c6556-vscqq_78e6748d-be2a-4245-8a1e-567e1d7aa398/console/0.log" Mar 19 09:45:25.686478 master-0 kubenswrapper[13205]: I0319 09:45:25.686358 13205 generic.go:334] "Generic (PLEG): container finished" podID="78e6748d-be2a-4245-8a1e-567e1d7aa398" containerID="66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24" exitCode=2 Mar 19 09:45:25.686478 master-0 kubenswrapper[13205]: I0319 09:45:25.686422 13205 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b9f6c6556-vscqq" Mar 19 09:45:25.686478 master-0 kubenswrapper[13205]: I0319 09:45:25.686433 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b9f6c6556-vscqq" event={"ID":"78e6748d-be2a-4245-8a1e-567e1d7aa398","Type":"ContainerDied","Data":"66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24"} Mar 19 09:45:25.687005 master-0 kubenswrapper[13205]: I0319 09:45:25.686514 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b9f6c6556-vscqq" event={"ID":"78e6748d-be2a-4245-8a1e-567e1d7aa398","Type":"ContainerDied","Data":"eec2b36a43be39dc2f660e26c0b9b8f2ee01a436e6bd29f57aa2cdf5ccf63113"} Mar 19 09:45:25.687005 master-0 kubenswrapper[13205]: I0319 09:45:25.686572 13205 scope.go:117] "RemoveContainer" containerID="66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24" Mar 19 09:45:25.715433 master-0 kubenswrapper[13205]: I0319 09:45:25.715384 13205 scope.go:117] "RemoveContainer" containerID="66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24" Mar 19 09:45:25.715898 master-0 kubenswrapper[13205]: E0319 09:45:25.715853 13205 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24\": container with ID starting with 66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24 not found: ID does not exist" containerID="66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24" Mar 19 09:45:25.716022 master-0 kubenswrapper[13205]: I0319 09:45:25.715935 13205 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24"} err="failed to get container status \"66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24\": rpc error: code = NotFound desc = could not find container \"66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24\": container with ID starting with 66d8022cecf101d2b09c5eae183d8c39949ace8b978b6eecc1c0ed9b179a4f24 not found: ID does not exist" Mar 19 09:45:25.742769 master-0 kubenswrapper[13205]: I0319 09:45:25.741901 13205 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b9f6c6556-vscqq"] Mar 19 09:45:25.750052 master-0 kubenswrapper[13205]: I0319 09:45:25.749960 13205 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b9f6c6556-vscqq"] Mar 19 09:45:26.863734 master-0 kubenswrapper[13205]: I0319 09:45:26.863666 13205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78e6748d-be2a-4245-8a1e-567e1d7aa398" path="/var/lib/kubelet/pods/78e6748d-be2a-4245-8a1e-567e1d7aa398/volumes" Mar 19 09:45:29.682024 master-0 kubenswrapper[13205]: I0319 09:45:29.681949 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:29.683004 master-0 kubenswrapper[13205]: I0319 09:45:29.682970 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:29.732215 master-0 kubenswrapper[13205]: I0319 09:45:29.732151 13205 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:29.789950 master-0 kubenswrapper[13205]: I0319 09:45:29.789885 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6hs5j" Mar 19 09:45:35.221125 master-0 kubenswrapper[13205]: E0319 09:45:35.221037 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:46:35.234955 master-0 kubenswrapper[13205]: E0319 09:46:35.234803 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:47:35.239373 master-0 kubenswrapper[13205]: E0319 09:47:35.239302 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id 
d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:48:35.214261 master-0 kubenswrapper[13205]: E0319 09:48:35.214208 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:49:35.225399 master-0 kubenswrapper[13205]: E0319 09:49:35.225306 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9 Mar 19 09:50:31.062547 master-0 kubenswrapper[13205]: I0319 09:50:31.055777 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h6v96/must-gather-w4w2j"] Mar 19 09:50:31.062547 master-0 kubenswrapper[13205]: E0319 09:50:31.056197 13205 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e6748d-be2a-4245-8a1e-567e1d7aa398" containerName="console" Mar 19 09:50:31.062547 master-0 kubenswrapper[13205]: I0319 09:50:31.056209 13205 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e6748d-be2a-4245-8a1e-567e1d7aa398" containerName="console" Mar 19 09:50:31.062547 master-0 kubenswrapper[13205]: I0319 09:50:31.056348 13205 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e6748d-be2a-4245-8a1e-567e1d7aa398" containerName="console" Mar 19 09:50:31.062547 master-0 kubenswrapper[13205]: I0319 09:50:31.057171 
13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h6v96/must-gather-w4w2j" Mar 19 09:50:31.062547 master-0 kubenswrapper[13205]: I0319 09:50:31.058956 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h6v96"/"kube-root-ca.crt" Mar 19 09:50:31.062547 master-0 kubenswrapper[13205]: I0319 09:50:31.059785 13205 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h6v96"/"openshift-service-ca.crt" Mar 19 09:50:31.094632 master-0 kubenswrapper[13205]: I0319 09:50:31.094584 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h6v96/must-gather-d24cd"] Mar 19 09:50:31.097085 master-0 kubenswrapper[13205]: I0319 09:50:31.096215 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h6v96/must-gather-d24cd" Mar 19 09:50:31.122440 master-0 kubenswrapper[13205]: I0319 09:50:31.121968 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h6v96/must-gather-w4w2j"] Mar 19 09:50:31.141705 master-0 kubenswrapper[13205]: I0319 09:50:31.141638 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h6v96/must-gather-d24cd"] Mar 19 09:50:31.246320 master-0 kubenswrapper[13205]: I0319 09:50:31.246235 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b-must-gather-output\") pod \"must-gather-w4w2j\" (UID: \"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b\") " pod="openshift-must-gather-h6v96/must-gather-w4w2j" Mar 19 09:50:31.246603 master-0 kubenswrapper[13205]: I0319 09:50:31.246338 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8rl\" (UniqueName: 
\"kubernetes.io/projected/2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b-kube-api-access-7k8rl\") pod \"must-gather-w4w2j\" (UID: \"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b\") " pod="openshift-must-gather-h6v96/must-gather-w4w2j" Mar 19 09:50:31.246603 master-0 kubenswrapper[13205]: I0319 09:50:31.246403 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhm52\" (UniqueName: \"kubernetes.io/projected/722f7ae2-0359-4df4-a8a0-f92023ead304-kube-api-access-nhm52\") pod \"must-gather-d24cd\" (UID: \"722f7ae2-0359-4df4-a8a0-f92023ead304\") " pod="openshift-must-gather-h6v96/must-gather-d24cd" Mar 19 09:50:31.246603 master-0 kubenswrapper[13205]: I0319 09:50:31.246423 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/722f7ae2-0359-4df4-a8a0-f92023ead304-must-gather-output\") pod \"must-gather-d24cd\" (UID: \"722f7ae2-0359-4df4-a8a0-f92023ead304\") " pod="openshift-must-gather-h6v96/must-gather-d24cd" Mar 19 09:50:31.348295 master-0 kubenswrapper[13205]: I0319 09:50:31.348135 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhm52\" (UniqueName: \"kubernetes.io/projected/722f7ae2-0359-4df4-a8a0-f92023ead304-kube-api-access-nhm52\") pod \"must-gather-d24cd\" (UID: \"722f7ae2-0359-4df4-a8a0-f92023ead304\") " pod="openshift-must-gather-h6v96/must-gather-d24cd" Mar 19 09:50:31.348295 master-0 kubenswrapper[13205]: I0319 09:50:31.348229 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/722f7ae2-0359-4df4-a8a0-f92023ead304-must-gather-output\") pod \"must-gather-d24cd\" (UID: \"722f7ae2-0359-4df4-a8a0-f92023ead304\") " pod="openshift-must-gather-h6v96/must-gather-d24cd" Mar 19 09:50:31.348582 master-0 kubenswrapper[13205]: I0319 09:50:31.348357 13205 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b-must-gather-output\") pod \"must-gather-w4w2j\" (UID: \"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b\") " pod="openshift-must-gather-h6v96/must-gather-w4w2j"
Mar 19 09:50:31.348582 master-0 kubenswrapper[13205]: I0319 09:50:31.348430 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8rl\" (UniqueName: \"kubernetes.io/projected/2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b-kube-api-access-7k8rl\") pod \"must-gather-w4w2j\" (UID: \"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b\") " pod="openshift-must-gather-h6v96/must-gather-w4w2j"
Mar 19 09:50:31.348847 master-0 kubenswrapper[13205]: I0319 09:50:31.348810 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/722f7ae2-0359-4df4-a8a0-f92023ead304-must-gather-output\") pod \"must-gather-d24cd\" (UID: \"722f7ae2-0359-4df4-a8a0-f92023ead304\") " pod="openshift-must-gather-h6v96/must-gather-d24cd"
Mar 19 09:50:31.349039 master-0 kubenswrapper[13205]: I0319 09:50:31.349004 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b-must-gather-output\") pod \"must-gather-w4w2j\" (UID: \"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b\") " pod="openshift-must-gather-h6v96/must-gather-w4w2j"
Mar 19 09:50:31.367318 master-0 kubenswrapper[13205]: I0319 09:50:31.367261 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhm52\" (UniqueName: \"kubernetes.io/projected/722f7ae2-0359-4df4-a8a0-f92023ead304-kube-api-access-nhm52\") pod \"must-gather-d24cd\" (UID: \"722f7ae2-0359-4df4-a8a0-f92023ead304\") " pod="openshift-must-gather-h6v96/must-gather-d24cd"
Mar 19 09:50:31.367318 master-0 kubenswrapper[13205]: I0319 09:50:31.367298 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8rl\" (UniqueName: \"kubernetes.io/projected/2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b-kube-api-access-7k8rl\") pod \"must-gather-w4w2j\" (UID: \"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b\") " pod="openshift-must-gather-h6v96/must-gather-w4w2j"
Mar 19 09:50:31.390518 master-0 kubenswrapper[13205]: I0319 09:50:31.390450 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h6v96/must-gather-w4w2j"
Mar 19 09:50:31.416152 master-0 kubenswrapper[13205]: I0319 09:50:31.416081 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h6v96/must-gather-d24cd"
Mar 19 09:50:31.840193 master-0 kubenswrapper[13205]: I0319 09:50:31.840139 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h6v96/must-gather-w4w2j"]
Mar 19 09:50:31.840656 master-0 kubenswrapper[13205]: W0319 09:50:31.840602 13205 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eba4a2a_4a9b_4f25_9e3d_01e9495caf3b.slice/crio-25c1c5ccddb1254a34031c6a9a8c8bf11f4ee3f350151e4c8a40210180bf2055 WatchSource:0}: Error finding container 25c1c5ccddb1254a34031c6a9a8c8bf11f4ee3f350151e4c8a40210180bf2055: Status 404 returned error can't find the container with id 25c1c5ccddb1254a34031c6a9a8c8bf11f4ee3f350151e4c8a40210180bf2055
Mar 19 09:50:31.842372 master-0 kubenswrapper[13205]: I0319 09:50:31.842325 13205 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:50:31.895347 master-0 kubenswrapper[13205]: I0319 09:50:31.895201 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h6v96/must-gather-d24cd"]
Mar 19 09:50:32.093803 master-0 kubenswrapper[13205]: I0319 09:50:32.093730 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/must-gather-w4w2j" event={"ID":"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b","Type":"ContainerStarted","Data":"25c1c5ccddb1254a34031c6a9a8c8bf11f4ee3f350151e4c8a40210180bf2055"}
Mar 19 09:50:32.095174 master-0 kubenswrapper[13205]: I0319 09:50:32.095141 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/must-gather-d24cd" event={"ID":"722f7ae2-0359-4df4-a8a0-f92023ead304","Type":"ContainerStarted","Data":"4e0e63abc8357262b7dac3586fdcad424bca62ec79d6fedbfd2293ccc449617f"}
Mar 19 09:50:35.089990 master-0 kubenswrapper[13205]: I0319 09:50:35.089921 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/must-gather-d24cd" event={"ID":"722f7ae2-0359-4df4-a8a0-f92023ead304","Type":"ContainerStarted","Data":"e000a242aa7e344eb9f643fd4fd722efb5446b694f6b9911093939bdf718f20e"}
Mar 19 09:50:35.089990 master-0 kubenswrapper[13205]: I0319 09:50:35.089973 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/must-gather-d24cd" event={"ID":"722f7ae2-0359-4df4-a8a0-f92023ead304","Type":"ContainerStarted","Data":"5c6a01ec20b889b82415ef27845030fc3d9b3b520fae4553627b7ac1f929408a"}
Mar 19 09:50:35.106275 master-0 kubenswrapper[13205]: I0319 09:50:35.106214 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h6v96/must-gather-d24cd" podStartSLOduration=2.64926051 podStartE2EDuration="5.106195643s" podCreationTimestamp="2026-03-19 09:50:30 +0000 UTC" firstStartedPulling="2026-03-19 09:50:31.908041729 +0000 UTC m=+1617.240348617" lastFinishedPulling="2026-03-19 09:50:34.364976862 +0000 UTC m=+1619.697283750" observedRunningTime="2026-03-19 09:50:35.102086593 +0000 UTC m=+1620.434393481" watchObservedRunningTime="2026-03-19 09:50:35.106195643 +0000 UTC m=+1620.438502531"
Mar 19 09:50:35.223949 master-0 kubenswrapper[13205]: E0319 09:50:35.223904 13205 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c609bb_136a_4ce2_b9e2_0a03e1a37a62.slice/crio-d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Error finding container d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9: Status 404 returned error can't find the container with id d390d793d78cb0c6b25d3fd46404c1707b8bf3ab69475ae942b9fc4559b304f9
Mar 19 09:50:37.310573 master-0 kubenswrapper[13205]: I0319 09:50:37.308241 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7d58488df-sf92q_1fac5afc-b7d8-4cc5-9d18-898ed3125320/cluster-version-operator/0.log"
Mar 19 09:50:39.910465 master-0 kubenswrapper[13205]: I0319 09:50:39.909377 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kssp8_1df0e178-2040-45f9-8189-5c1d4bca71bd/controller/0.log"
Mar 19 09:50:39.928554 master-0 kubenswrapper[13205]: I0319 09:50:39.927610 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kssp8_1df0e178-2040-45f9-8189-5c1d4bca71bd/kube-rbac-proxy/0.log"
Mar 19 09:50:39.949553 master-0 kubenswrapper[13205]: I0319 09:50:39.948357 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/controller/0.log"
Mar 19 09:50:40.017550 master-0 kubenswrapper[13205]: I0319 09:50:40.017003 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/frr/0.log"
Mar 19 09:50:40.026550 master-0 kubenswrapper[13205]: I0319 09:50:40.026085 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/reloader/0.log"
Mar 19 09:50:40.030554 master-0 kubenswrapper[13205]: I0319 09:50:40.030103 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/frr-metrics/0.log"
Mar 19 09:50:40.045119 master-0 kubenswrapper[13205]: I0319 09:50:40.043437 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/kube-rbac-proxy/0.log"
Mar 19 09:50:40.062467 master-0 kubenswrapper[13205]: I0319 09:50:40.059153 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/kube-rbac-proxy-frr/0.log"
Mar 19 09:50:40.064960 master-0 kubenswrapper[13205]: I0319 09:50:40.064796 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-frr-files/0.log"
Mar 19 09:50:40.086547 master-0 kubenswrapper[13205]: I0319 09:50:40.085004 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-k4lpt_23525aee-4327-4f3f-a471-501ab5740c98/nmstate-console-plugin/0.log"
Mar 19 09:50:40.094544 master-0 kubenswrapper[13205]: I0319 09:50:40.089414 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-reloader/0.log"
Mar 19 09:50:40.103692 master-0 kubenswrapper[13205]: I0319 09:50:40.101747 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-metrics/0.log"
Mar 19 09:50:40.131653 master-0 kubenswrapper[13205]: I0319 09:50:40.118097 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qvpr9_d081bb39-3cce-4a28-b53c-41414b48a4be/frr-k8s-webhook-server/0.log"
Mar 19 09:50:40.131653 master-0 kubenswrapper[13205]: I0319 09:50:40.118602 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7bhhs_eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7/nmstate-handler/0.log"
Mar 19 09:50:40.150949 master-0 kubenswrapper[13205]: I0319 09:50:40.150881 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-mxtkr_1d13de7b-9da4-4240-a24d-d85ff82b405e/nmstate-metrics/0.log"
Mar 19 09:50:40.164045 master-0 kubenswrapper[13205]: I0319 09:50:40.163986 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-54b99f6f6b-hk7ql_b3b14285-61c6-4760-85bc-64667c85f8af/manager/0.log"
Mar 19 09:50:40.180562 master-0 kubenswrapper[13205]: I0319 09:50:40.179435 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69bcd667c-x84zz_47802e4a-72b0-4595-a33e-ca548f695f60/webhook-server/0.log"
Mar 19 09:50:40.186422 master-0 kubenswrapper[13205]: I0319 09:50:40.186356 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-mxtkr_1d13de7b-9da4-4240-a24d-d85ff82b405e/kube-rbac-proxy/0.log"
Mar 19 09:50:40.208633 master-0 kubenswrapper[13205]: I0319 09:50:40.206228 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-95c7k_6ab170f8-577b-4f2c-a8f3-8fe1a5e45274/nmstate-operator/0.log"
Mar 19 09:50:40.225880 master-0 kubenswrapper[13205]: I0319 09:50:40.225817 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-d44x4_21ed9745-5a64-43ba-94be-b27034f5de86/nmstate-webhook/0.log"
Mar 19 09:50:40.266836 master-0 kubenswrapper[13205]: I0319 09:50:40.260481 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dvddw_3391abc7-00e2-4a16-95b0-4961dabda05b/speaker/0.log"
Mar 19 09:50:40.277580 master-0 kubenswrapper[13205]: I0319 09:50:40.275325 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dvddw_3391abc7-00e2-4a16-95b0-4961dabda05b/kube-rbac-proxy/0.log"
Mar 19 09:50:42.646801 master-0 kubenswrapper[13205]: I0319 09:50:42.646122 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcdctl/0.log"
Mar 19 09:50:42.793580 master-0 kubenswrapper[13205]: I0319 09:50:42.792030 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd/0.log"
Mar 19 09:50:42.815629 master-0 kubenswrapper[13205]: I0319 09:50:42.814592 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-metrics/0.log"
Mar 19 09:50:42.830274 master-0 kubenswrapper[13205]: I0319 09:50:42.830238 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-readyz/0.log"
Mar 19 09:50:42.855148 master-0 kubenswrapper[13205]: I0319 09:50:42.855104 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-rev/0.log"
Mar 19 09:50:42.884575 master-0 kubenswrapper[13205]: I0319 09:50:42.884513 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/setup/0.log"
Mar 19 09:50:42.904109 master-0 kubenswrapper[13205]: I0319 09:50:42.904032 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-ensure-env-vars/0.log"
Mar 19 09:50:42.945204 master-0 kubenswrapper[13205]: I0319 09:50:42.944850 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-resources-copy/0.log"
Mar 19 09:50:43.021615 master-0 kubenswrapper[13205]: I0319 09:50:43.021460 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a/installer/0.log"
Mar 19 09:50:43.082245 master-0 kubenswrapper[13205]: I0319 09:50:43.082212 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_b149c739-203d-4f5a-af11-dba6835ed71d/installer/0.log"
Mar 19 09:50:43.410146 master-0 kubenswrapper[13205]: I0319 09:50:43.409740 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-55975d94bc-hbbc2_05446c48-303e-434f-9d9b-eec4f1f2b253/oauth-openshift/0.log"
Mar 19 09:50:43.977884 master-0 kubenswrapper[13205]: I0319 09:50:43.977858 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-kwrpk_84e1a860-b3b0-4f3e-ac3d-9f4e40429ae9/assisted-installer-controller/0.log"
Mar 19 09:50:44.282831 master-0 kubenswrapper[13205]: I0319 09:50:44.282778 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-k4dfd_e7fae040-28fa-4d97-8482-fd0dd12cc921/authentication-operator/0.log"
Mar 19 09:50:44.307239 master-0 kubenswrapper[13205]: I0319 09:50:44.307204 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-k4dfd_e7fae040-28fa-4d97-8482-fd0dd12cc921/authentication-operator/1.log"
Mar 19 09:50:45.802581 master-0 kubenswrapper[13205]: I0319 09:50:45.802537 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dcf5569b5-29cq2_0f7b58ba-ff67-416a-880a-b7e0f9a6e35f/router/0.log"
Mar 19 09:50:46.263591 master-0 kubenswrapper[13205]: I0319 09:50:46.263408 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/must-gather-w4w2j" event={"ID":"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b","Type":"ContainerStarted","Data":"5d3792c0c0ddb55839e65b8669e5c225feac083ff6c85b99699b4f07b3ca7a90"}
Mar 19 09:50:46.263591 master-0 kubenswrapper[13205]: I0319 09:50:46.263458 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/must-gather-w4w2j" event={"ID":"2eba4a2a-4a9b-4f25-9e3d-01e9495caf3b","Type":"ContainerStarted","Data":"84db82477435ea0049245f3ac072ce4d4d6d901d08ca4fcf9ce3e2f65afbc596"}
Mar 19 09:50:46.293219 master-0 kubenswrapper[13205]: I0319 09:50:46.293133 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h6v96/must-gather-w4w2j" podStartSLOduration=2.475084158 podStartE2EDuration="16.293109883s" podCreationTimestamp="2026-03-19 09:50:30 +0000 UTC" firstStartedPulling="2026-03-19 09:50:31.842270411 +0000 UTC m=+1617.174577309" lastFinishedPulling="2026-03-19 09:50:45.660296146 +0000 UTC m=+1630.992603034" observedRunningTime="2026-03-19 09:50:46.2888945 +0000 UTC m=+1631.621201398" watchObservedRunningTime="2026-03-19 09:50:46.293109883 +0000 UTC m=+1631.625416771"
Mar 19 09:50:46.494853 master-0 kubenswrapper[13205]: I0319 09:50:46.494806 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-57c47bdf6-d9h47_5a51c701-7f2a-4332-a301-746e8a0eb475/oauth-apiserver/0.log"
Mar 19 09:50:46.505595 master-0 kubenswrapper[13205]: I0319 09:50:46.505482 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-57c47bdf6-d9h47_5a51c701-7f2a-4332-a301-746e8a0eb475/fix-audit-permissions/0.log"
Mar 19 09:50:47.190906 master-0 kubenswrapper[13205]: I0319 09:50:47.190855 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/kube-rbac-proxy/0.log"
Mar 19 09:50:47.208696 master-0 kubenswrapper[13205]: I0319 09:50:47.208632 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/cluster-autoscaler-operator/0.log"
Mar 19 09:50:47.235113 master-0 kubenswrapper[13205]: I0319 09:50:47.235065 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/cluster-autoscaler-operator/1.log"
Mar 19 09:50:47.262192 master-0 kubenswrapper[13205]: I0319 09:50:47.262140 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/cluster-baremetal-operator/1.log"
Mar 19 09:50:47.263037 master-0 kubenswrapper[13205]: I0319 09:50:47.262991 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/cluster-baremetal-operator/0.log"
Mar 19 09:50:47.278405 master-0 kubenswrapper[13205]: I0319 09:50:47.278334 13205 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"]
Mar 19 09:50:47.280349 master-0 kubenswrapper[13205]: I0319 09:50:47.280315 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.314988 master-0 kubenswrapper[13205]: I0319 09:50:47.289136 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"]
Mar 19 09:50:47.323037 master-0 kubenswrapper[13205]: I0319 09:50:47.322992 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/baremetal-kube-rbac-proxy/0.log"
Mar 19 09:50:47.326628 master-0 kubenswrapper[13205]: I0319 09:50:47.326567 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-proc\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.326628 master-0 kubenswrapper[13205]: I0319 09:50:47.326628 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-podres\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.326886 master-0 kubenswrapper[13205]: I0319 09:50:47.326677 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-sys\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.326886 master-0 kubenswrapper[13205]: I0319 09:50:47.326810 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4zts\" (UniqueName: \"kubernetes.io/projected/154de4dd-725b-406a-9ada-1ecaa490b956-kube-api-access-z4zts\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.326976 master-0 kubenswrapper[13205]: I0319 09:50:47.326899 13205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-lib-modules\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.354917 master-0 kubenswrapper[13205]: I0319 09:50:47.354851 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-v9n8l_9a0f93ac-a77b-488a-bcc4-a45702a9e32d/control-plane-machine-set-operator/1.log"
Mar 19 09:50:47.355278 master-0 kubenswrapper[13205]: I0319 09:50:47.355245 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-v9n8l_9a0f93ac-a77b-488a-bcc4-a45702a9e32d/control-plane-machine-set-operator/0.log"
Mar 19 09:50:47.378460 master-0 kubenswrapper[13205]: I0319 09:50:47.378379 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-nmvcv_c10d0e00-cf19-4067-b7bf-ff569f2f3d71/kube-rbac-proxy/0.log"
Mar 19 09:50:47.402119 master-0 kubenswrapper[13205]: I0319 09:50:47.400581 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-nmvcv_c10d0e00-cf19-4067-b7bf-ff569f2f3d71/machine-api-operator/0.log"
Mar 19 09:50:47.429414 master-0 kubenswrapper[13205]: I0319 09:50:47.429304 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-proc\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.429414 master-0 kubenswrapper[13205]: I0319 09:50:47.429418 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-podres\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.429756 master-0 kubenswrapper[13205]: I0319 09:50:47.429491 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-sys\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.429756 master-0 kubenswrapper[13205]: I0319 09:50:47.429552 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4zts\" (UniqueName: \"kubernetes.io/projected/154de4dd-725b-406a-9ada-1ecaa490b956-kube-api-access-z4zts\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.429756 master-0 kubenswrapper[13205]: I0319 09:50:47.429658 13205 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-lib-modules\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.430208 master-0 kubenswrapper[13205]: I0319 09:50:47.430160 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-lib-modules\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.430278 master-0 kubenswrapper[13205]: I0319 09:50:47.430260 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-proc\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.430410 master-0 kubenswrapper[13205]: I0319 09:50:47.430369 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-podres\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.430470 master-0 kubenswrapper[13205]: I0319 09:50:47.430431 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/154de4dd-725b-406a-9ada-1ecaa490b956-sys\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.450885 master-0 kubenswrapper[13205]: I0319 09:50:47.450726 13205 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4zts\" (UniqueName: \"kubernetes.io/projected/154de4dd-725b-406a-9ada-1ecaa490b956-kube-api-access-z4zts\") pod \"perf-node-gather-daemonset-qqhdh\" (UID: \"154de4dd-725b-406a-9ada-1ecaa490b956\") " pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:47.633938 master-0 kubenswrapper[13205]: I0319 09:50:47.633883 13205 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:48.189925 master-0 kubenswrapper[13205]: I0319 09:50:48.189858 13205 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"]
Mar 19 09:50:48.278858 master-0 kubenswrapper[13205]: I0319 09:50:48.278801 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh" event={"ID":"154de4dd-725b-406a-9ada-1ecaa490b956","Type":"ContainerStarted","Data":"4bec3b2603599a2f9a783c52773a7f883e83d765af2e6f307d07b06352032db1"}
Mar 19 09:50:48.558134 master-0 kubenswrapper[13205]: I0319 09:50:48.558085 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/cluster-cloud-controller-manager/1.log"
Mar 19 09:50:48.563620 master-0 kubenswrapper[13205]: I0319 09:50:48.563524 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/cluster-cloud-controller-manager/0.log"
Mar 19 09:50:48.599039 master-0 kubenswrapper[13205]: I0319 09:50:48.598994 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/config-sync-controllers/1.log"
Mar 19 09:50:48.607275 master-0 kubenswrapper[13205]: I0319 09:50:48.607027 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/config-sync-controllers/0.log"
Mar 19 09:50:48.622429 master-0 kubenswrapper[13205]: I0319 09:50:48.622378 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-r2cs7_9e10cb6e-5703-4e4d-a82b-f6de34888b65/kube-rbac-proxy/0.log"
Mar 19 09:50:49.289773 master-0 kubenswrapper[13205]: I0319 09:50:49.289704 13205 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh" event={"ID":"154de4dd-725b-406a-9ada-1ecaa490b956","Type":"ContainerStarted","Data":"0f1a5b367850b5ee7e8ca769ca2065df0bc23ee5733bad850a79f8c2239b2366"}
Mar 19 09:50:49.290706 master-0 kubenswrapper[13205]: I0319 09:50:49.289909 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh"
Mar 19 09:50:49.311419 master-0 kubenswrapper[13205]: I0319 09:50:49.311352 13205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh" podStartSLOduration=2.311336215 podStartE2EDuration="2.311336215s" podCreationTimestamp="2026-03-19 09:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:49.310697859 +0000 UTC m=+1634.643004747" watchObservedRunningTime="2026-03-19 09:50:49.311336215 +0000 UTC m=+1634.643643103"
Mar 19 09:50:50.054334 master-0 kubenswrapper[13205]: I0319 09:50:50.054287 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-zgn8x_6775d7ec-8114-4fc3-a23d-d5ac910f3285/kube-rbac-proxy/0.log"
Mar 19 09:50:50.080404 master-0 kubenswrapper[13205]: I0319 09:50:50.080347 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-zgn8x_6775d7ec-8114-4fc3-a23d-d5ac910f3285/cloud-credential-operator/0.log"
Mar 19 09:50:51.331153 master-0 kubenswrapper[13205]: I0319 09:50:51.331101 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-rfnfj_7c70267e-b555-4d56-92e4-f24b65b61283/openshift-config-operator/0.log"
Mar 19 09:50:51.340410 master-0 kubenswrapper[13205]: I0319 09:50:51.340364 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-rfnfj_7c70267e-b555-4d56-92e4-f24b65b61283/openshift-api/0.log"
Mar 19 09:50:52.062682 master-0 kubenswrapper[13205]: I0319 09:50:52.062611 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-xcb24_7d185b6f-c2ea-4570-a9a0-9b2562e0a2b0/console-operator/0.log"
Mar 19 09:50:52.195407 master-0 kubenswrapper[13205]: I0319 09:50:52.195349 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kssp8_1df0e178-2040-45f9-8189-5c1d4bca71bd/controller/0.log"
Mar 19 09:50:52.201509 master-0 kubenswrapper[13205]: I0319 09:50:52.201465 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kssp8_1df0e178-2040-45f9-8189-5c1d4bca71bd/kube-rbac-proxy/0.log"
Mar 19 09:50:52.219687 master-0 kubenswrapper[13205]: I0319 09:50:52.219642 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/controller/0.log"
Mar 19 09:50:52.265176 master-0 kubenswrapper[13205]: I0319 09:50:52.265123 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/frr/0.log"
Mar 19 09:50:52.278495 master-0 kubenswrapper[13205]: I0319 09:50:52.278438 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/reloader/0.log"
Mar 19 09:50:52.293228 master-0 kubenswrapper[13205]: I0319 09:50:52.293175 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/frr-metrics/0.log"
Mar 19 09:50:52.301732 master-0 kubenswrapper[13205]: I0319 09:50:52.301670 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/kube-rbac-proxy/0.log"
Mar 19 09:50:52.307928 master-0 kubenswrapper[13205]: I0319 09:50:52.307889 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/kube-rbac-proxy-frr/0.log"
Mar 19 09:50:52.316234 master-0 kubenswrapper[13205]: I0319 09:50:52.316131 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-frr-files/0.log"
Mar 19 09:50:52.325360 master-0 kubenswrapper[13205]: I0319 09:50:52.325308 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-reloader/0.log"
Mar 19 09:50:52.338349 master-0 kubenswrapper[13205]: I0319 09:50:52.338306 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-metrics/0.log"
Mar 19 09:50:52.352876 master-0 kubenswrapper[13205]: I0319 09:50:52.352826 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qvpr9_d081bb39-3cce-4a28-b53c-41414b48a4be/frr-k8s-webhook-server/0.log"
Mar 19 09:50:52.381318 master-0 kubenswrapper[13205]: I0319 09:50:52.381269 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-54b99f6f6b-hk7ql_b3b14285-61c6-4760-85bc-64667c85f8af/manager/0.log"
Mar 19 09:50:52.391889 master-0 kubenswrapper[13205]: I0319 09:50:52.391837 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69bcd667c-x84zz_47802e4a-72b0-4595-a33e-ca548f695f60/webhook-server/0.log"
Mar 19 09:50:52.474298 master-0 kubenswrapper[13205]: I0319 09:50:52.474178 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dvddw_3391abc7-00e2-4a16-95b0-4961dabda05b/speaker/0.log"
Mar 19 09:50:52.481201 master-0 kubenswrapper[13205]: I0319 09:50:52.481137 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dvddw_3391abc7-00e2-4a16-95b0-4961dabda05b/kube-rbac-proxy/0.log"
Mar 19 09:50:52.664225 master-0 kubenswrapper[13205]: I0319 09:50:52.664118 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5ccdf4d79d-4zsfz_dea63849-8bd2-479d-b5bc-a9eaadba35f3/console/0.log"
Mar 19 09:50:52.690726 master-0 kubenswrapper[13205]: I0319 09:50:52.690680 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-66b8ffb895-hfq5p_437ab63c-8bc0-4761-81fd-0da0052a9628/download-server/0.log"
Mar 19 09:50:53.461047 master-0 kubenswrapper[13205]: I0319 09:50:53.460994 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-7d87854d6-n7hxq_ff50023c-0f3f-4506-b26f-9872d0eec45e/cluster-storage-operator/0.log"
Mar 19 09:50:53.475973 master-0 kubenswrapper[13205]: I0319 09:50:53.475907 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/3.log"
Mar 19 09:50:53.476216 master-0 kubenswrapper[13205]: I0319 09:50:53.476131 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-v9s9c_dc65ec1f-b8fb-40d6-ac39-46b255a33221/snapshot-controller/4.log"
Mar 19 09:50:53.514636 master-0 kubenswrapper[13205]: I0319 09:50:53.514581 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5f5d689c6b-jv8lm_8e073eb4-67f2-4de7-8848-50da73079dbc/csi-snapshot-controller-operator/0.log"
Mar 19 09:50:54.045230 master-0 kubenswrapper[13205]: I0319 09:50:54.045191 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-cbw4r_16d2930b-486b-492d-983e-c6702d8f53a7/dns-operator/0.log"
Mar 19 09:50:54.055142 master-0 kubenswrapper[13205]: I0319 09:50:54.055092 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-cbw4r_16d2930b-486b-492d-983e-c6702d8f53a7/kube-rbac-proxy/0.log"
Mar 19 09:50:54.511166 master-0 kubenswrapper[13205]: I0319 09:50:54.511029 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6hs5j_e9e5abec-3ac3-4b91-b51d-74efffaff8f8/registry-server/0.log"
Mar 19 09:50:54.516137 master-0 kubenswrapper[13205]: I0319 09:50:54.516092 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-79jrh_745093e5-ffe1-4443-b317-448948f3b311/dns/0.log"
Mar 19 09:50:54.555406 master-0 kubenswrapper[13205]: I0319 09:50:54.555350 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-79jrh_745093e5-ffe1-4443-b317-448948f3b311/kube-rbac-proxy/0.log"
Mar 19 09:50:54.579319 master-0 kubenswrapper[13205]: I0319 09:50:54.579273 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7llkw_89f3f27a-83eb-4cd9-b557-aeee15998793/dns-node-resolver/0.log"
Mar 19 09:50:55.276151 master-0 kubenswrapper[13205]: I0319 09:50:55.276096 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-5bddk_a1098584-43b9-4f2c-83d2-22d95fb7b0c3/etcd-operator/1.log"
Mar 19 09:50:55.294072 master-0 kubenswrapper[13205]: I0319 09:50:55.294023 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-5bddk_a1098584-43b9-4f2c-83d2-22d95fb7b0c3/etcd-operator/0.log"
Mar 19 09:50:55.752243 master-0 kubenswrapper[13205]: I0319 09:50:55.752115 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcdctl/0.log"
Mar 19 09:50:55.900093 master-0 kubenswrapper[13205]: I0319 09:50:55.900005 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd/0.log"
Mar 19 09:50:55.913629 master-0 kubenswrapper[13205]: I0319 09:50:55.913576 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-metrics/0.log"
Mar 19 09:50:55.922664 master-0 kubenswrapper[13205]: I0319 09:50:55.922616 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-readyz/0.log"
Mar 19 09:50:55.934008 master-0 kubenswrapper[13205]: I0319 09:50:55.933939 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-rev/0.log"
Mar 19 09:50:55.950635 master-0 kubenswrapper[13205]: I0319 09:50:55.950167 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/setup/0.log"
Mar 19 09:50:55.964904 master-0 kubenswrapper[13205]: I0319 09:50:55.964843 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-ensure-env-vars/0.log"
Mar 19 
09:50:55.976706 master-0 kubenswrapper[13205]: I0319 09:50:55.976657 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-resources-copy/0.log" Mar 19 09:50:56.015421 master-0 kubenswrapper[13205]: I0319 09:50:56.015366 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_c5e3b99a-24af-42a0-bf5f-d82b91ecbc6a/installer/0.log" Mar 19 09:50:56.060372 master-0 kubenswrapper[13205]: I0319 09:50:56.060291 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_b149c739-203d-4f5a-af11-dba6835ed71d/installer/0.log" Mar 19 09:50:56.744102 master-0 kubenswrapper[13205]: I0319 09:50:56.744039 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-5549dc66cb-5m8t6_c247d991-809e-46b6-9617-9b05007b7560/cluster-image-registry-operator/0.log" Mar 19 09:50:56.758196 master-0 kubenswrapper[13205]: I0319 09:50:56.758148 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cd9c5_9e6648a1-bdd4-4c53-921b-790e8308d8e3/node-ca/0.log" Mar 19 09:50:57.288834 master-0 kubenswrapper[13205]: I0319 09:50:57.288772 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-rvwfh_03d12dab-1215-4c1f-a9f5-27ea7174d308/ingress-operator/0.log" Mar 19 09:50:57.300385 master-0 kubenswrapper[13205]: I0319 09:50:57.300339 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-rvwfh_03d12dab-1215-4c1f-a9f5-27ea7174d308/kube-rbac-proxy/0.log" Mar 19 09:50:57.657199 master-0 kubenswrapper[13205]: I0319 09:50:57.657094 13205 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h6v96/perf-node-gather-daemonset-qqhdh" Mar 19 09:50:57.989411 master-0 
kubenswrapper[13205]: I0319 09:50:57.989290 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bf4j2_dedbf9d8-ffec-4bcb-b175-20efd7b7366e/serve-healthcheck-canary/0.log" Mar 19 09:50:58.574031 master-0 kubenswrapper[13205]: I0319 09:50:58.573980 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-68bf6ff9d6-4qq6m_4869583f-43af-4ec9-8dea-1da1634816dc/insights-operator/0.log" Mar 19 09:51:00.102135 master-0 kubenswrapper[13205]: I0319 09:51:00.102078 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b6909618-647a-45d3-9027-3c3578992af1/alertmanager/0.log" Mar 19 09:51:00.115629 master-0 kubenswrapper[13205]: I0319 09:51:00.115493 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b6909618-647a-45d3-9027-3c3578992af1/config-reloader/0.log" Mar 19 09:51:00.128367 master-0 kubenswrapper[13205]: I0319 09:51:00.128247 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b6909618-647a-45d3-9027-3c3578992af1/kube-rbac-proxy-web/0.log" Mar 19 09:51:00.153987 master-0 kubenswrapper[13205]: I0319 09:51:00.153930 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b6909618-647a-45d3-9027-3c3578992af1/kube-rbac-proxy/0.log" Mar 19 09:51:00.168308 master-0 kubenswrapper[13205]: I0319 09:51:00.168258 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b6909618-647a-45d3-9027-3c3578992af1/kube-rbac-proxy-metric/0.log" Mar 19 09:51:00.182494 master-0 kubenswrapper[13205]: I0319 09:51:00.182414 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b6909618-647a-45d3-9027-3c3578992af1/prom-label-proxy/0.log" Mar 19 09:51:00.207867 master-0 kubenswrapper[13205]: 
I0319 09:51:00.207793 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b6909618-647a-45d3-9027-3c3578992af1/init-config-reloader/0.log" Mar 19 09:51:00.257468 master-0 kubenswrapper[13205]: I0319 09:51:00.257419 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-58845fbb57-rtzvj_8c8ee765-76b8-4cde-8acb-6e5edd1b8149/cluster-monitoring-operator/0.log" Mar 19 09:51:00.273065 master-0 kubenswrapper[13205]: I0319 09:51:00.273018 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-dtbjl_9427be32-8a99-4a07-aec9-5fe1ddcf1e2f/kube-state-metrics/0.log" Mar 19 09:51:00.288587 master-0 kubenswrapper[13205]: I0319 09:51:00.288517 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-dtbjl_9427be32-8a99-4a07-aec9-5fe1ddcf1e2f/kube-rbac-proxy-main/0.log" Mar 19 09:51:00.302214 master-0 kubenswrapper[13205]: I0319 09:51:00.302153 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-dtbjl_9427be32-8a99-4a07-aec9-5fe1ddcf1e2f/kube-rbac-proxy-self/0.log" Mar 19 09:51:00.322740 master-0 kubenswrapper[13205]: I0319 09:51:00.322692 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-fd9c84ccc-jlrzf_36a8dce8-1815-4676-ab46-2cce5bc21bfd/metrics-server/0.log" Mar 19 09:51:00.338914 master-0 kubenswrapper[13205]: I0319 09:51:00.338876 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6785466799-nphk8_2d6d2016-5c9a-4772-b247-255563ba9fad/monitoring-plugin/0.log" Mar 19 09:51:00.375055 master-0 kubenswrapper[13205]: I0319 09:51:00.375007 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-pfzk7_86f98011-564c-4f08-8b8e-9d0518b77945/node-exporter/0.log" Mar 19 09:51:00.390239 master-0 kubenswrapper[13205]: I0319 09:51:00.390116 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pfzk7_86f98011-564c-4f08-8b8e-9d0518b77945/kube-rbac-proxy/0.log" Mar 19 09:51:00.405358 master-0 kubenswrapper[13205]: I0319 09:51:00.405318 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pfzk7_86f98011-564c-4f08-8b8e-9d0518b77945/init-textfile/0.log" Mar 19 09:51:00.424738 master-0 kubenswrapper[13205]: I0319 09:51:00.424684 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-6kvc4_46a87945-656e-4154-9235-644a90bffe83/kube-rbac-proxy-main/0.log" Mar 19 09:51:00.443440 master-0 kubenswrapper[13205]: I0319 09:51:00.443388 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-6kvc4_46a87945-656e-4154-9235-644a90bffe83/kube-rbac-proxy-self/0.log" Mar 19 09:51:00.464570 master-0 kubenswrapper[13205]: I0319 09:51:00.464515 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-6kvc4_46a87945-656e-4154-9235-644a90bffe83/openshift-state-metrics/0.log" Mar 19 09:51:00.509959 master-0 kubenswrapper[13205]: I0319 09:51:00.509877 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07b2b70-f2f5-4575-add3-5fde88fc4848/prometheus/0.log" Mar 19 09:51:00.521513 master-0 kubenswrapper[13205]: I0319 09:51:00.521474 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07b2b70-f2f5-4575-add3-5fde88fc4848/config-reloader/0.log" Mar 19 09:51:00.534941 master-0 kubenswrapper[13205]: I0319 09:51:00.534879 13205 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07b2b70-f2f5-4575-add3-5fde88fc4848/thanos-sidecar/0.log" Mar 19 09:51:00.548020 master-0 kubenswrapper[13205]: I0319 09:51:00.547970 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07b2b70-f2f5-4575-add3-5fde88fc4848/kube-rbac-proxy-web/0.log" Mar 19 09:51:00.561788 master-0 kubenswrapper[13205]: I0319 09:51:00.561745 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/kube-rbac-proxy/0.log" Mar 19 09:51:00.566366 master-0 kubenswrapper[13205]: I0319 09:51:00.566332 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07b2b70-f2f5-4575-add3-5fde88fc4848/kube-rbac-proxy/0.log" Mar 19 09:51:00.569139 master-0 kubenswrapper[13205]: I0319 09:51:00.569085 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/cluster-autoscaler-operator/0.log" Mar 19 09:51:00.591515 master-0 kubenswrapper[13205]: I0319 09:51:00.591469 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07b2b70-f2f5-4575-add3-5fde88fc4848/kube-rbac-proxy-thanos/0.log" Mar 19 09:51:00.592552 master-0 kubenswrapper[13205]: I0319 09:51:00.592499 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-rsnsn_6904be4c-4f5f-4176-8100-7b6955c6d8da/cluster-autoscaler-operator/1.log" Mar 19 09:51:00.601104 master-0 kubenswrapper[13205]: I0319 09:51:00.601053 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/cluster-baremetal-operator/1.log" Mar 19 09:51:00.602996 master-0 
kubenswrapper[13205]: I0319 09:51:00.602955 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/cluster-baremetal-operator/0.log" Mar 19 09:51:00.610290 master-0 kubenswrapper[13205]: I0319 09:51:00.610227 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nf2m5_87b757ff-ca45-4dc7-b31f-ccca53cb2354/baremetal-kube-rbac-proxy/0.log" Mar 19 09:51:00.615385 master-0 kubenswrapper[13205]: I0319 09:51:00.615336 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07b2b70-f2f5-4575-add3-5fde88fc4848/init-config-reloader/0.log" Mar 19 09:51:00.623910 master-0 kubenswrapper[13205]: I0319 09:51:00.623873 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-v9n8l_9a0f93ac-a77b-488a-bcc4-a45702a9e32d/control-plane-machine-set-operator/1.log" Mar 19 09:51:00.624133 master-0 kubenswrapper[13205]: I0319 09:51:00.624110 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-v9n8l_9a0f93ac-a77b-488a-bcc4-a45702a9e32d/control-plane-machine-set-operator/0.log" Mar 19 09:51:00.633299 master-0 kubenswrapper[13205]: I0319 09:51:00.633247 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6c8df6d4b-9lpld_793e12a6-caff-4738-96ad-da1377e09fe8/prometheus-operator/0.log" Mar 19 09:51:00.641991 master-0 kubenswrapper[13205]: I0319 09:51:00.641878 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-nmvcv_c10d0e00-cf19-4067-b7bf-ff569f2f3d71/kube-rbac-proxy/0.log" Mar 19 09:51:00.652198 master-0 kubenswrapper[13205]: I0319 09:51:00.652145 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-6c8df6d4b-9lpld_793e12a6-caff-4738-96ad-da1377e09fe8/kube-rbac-proxy/0.log" Mar 19 09:51:00.662594 master-0 kubenswrapper[13205]: I0319 09:51:00.660621 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-nmvcv_c10d0e00-cf19-4067-b7bf-ff569f2f3d71/machine-api-operator/0.log" Mar 19 09:51:00.674018 master-0 kubenswrapper[13205]: I0319 09:51:00.673979 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-69c6b55594-7wt9n_b0bc69d1-1383-478f-9a9e-c23e88646056/prometheus-operator-admission-webhook/0.log" Mar 19 09:51:00.700543 master-0 kubenswrapper[13205]: I0319 09:51:00.700366 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f7f86499b-nc978_5c7e8229-0d0e-4de4-9636-30bb0de815d9/telemeter-client/1.log" Mar 19 09:51:00.700743 master-0 kubenswrapper[13205]: I0319 09:51:00.700598 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f7f86499b-nc978_5c7e8229-0d0e-4de4-9636-30bb0de815d9/telemeter-client/0.log" Mar 19 09:51:00.713481 master-0 kubenswrapper[13205]: I0319 09:51:00.713439 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f7f86499b-nc978_5c7e8229-0d0e-4de4-9636-30bb0de815d9/reload/0.log" Mar 19 09:51:00.732015 master-0 kubenswrapper[13205]: I0319 09:51:00.731960 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f7f86499b-nc978_5c7e8229-0d0e-4de4-9636-30bb0de815d9/kube-rbac-proxy/0.log" Mar 19 09:51:00.758910 master-0 kubenswrapper[13205]: I0319 09:51:00.758863 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-594b9755d9-zlw92_16a30c06-47fd-44c5-8a5f-91374c9fbcdc/thanos-query/0.log" Mar 19 09:51:00.771028 master-0 
kubenswrapper[13205]: I0319 09:51:00.770985 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-594b9755d9-zlw92_16a30c06-47fd-44c5-8a5f-91374c9fbcdc/kube-rbac-proxy-web/0.log" Mar 19 09:51:00.782661 master-0 kubenswrapper[13205]: I0319 09:51:00.782619 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-594b9755d9-zlw92_16a30c06-47fd-44c5-8a5f-91374c9fbcdc/kube-rbac-proxy/0.log" Mar 19 09:51:00.795461 master-0 kubenswrapper[13205]: I0319 09:51:00.795401 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-594b9755d9-zlw92_16a30c06-47fd-44c5-8a5f-91374c9fbcdc/prom-label-proxy/0.log" Mar 19 09:51:00.807753 master-0 kubenswrapper[13205]: I0319 09:51:00.807695 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-594b9755d9-zlw92_16a30c06-47fd-44c5-8a5f-91374c9fbcdc/kube-rbac-proxy-rules/0.log" Mar 19 09:51:00.821657 master-0 kubenswrapper[13205]: I0319 09:51:00.821615 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-594b9755d9-zlw92_16a30c06-47fd-44c5-8a5f-91374c9fbcdc/kube-rbac-proxy-metrics/0.log" Mar 19 09:51:02.268088 master-0 kubenswrapper[13205]: I0319 09:51:02.268029 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kssp8_1df0e178-2040-45f9-8189-5c1d4bca71bd/controller/0.log" Mar 19 09:51:02.278082 master-0 kubenswrapper[13205]: I0319 09:51:02.278020 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kssp8_1df0e178-2040-45f9-8189-5c1d4bca71bd/kube-rbac-proxy/0.log" Mar 19 09:51:02.300595 master-0 kubenswrapper[13205]: I0319 09:51:02.300493 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/controller/0.log" Mar 19 09:51:02.353219 master-0 
kubenswrapper[13205]: I0319 09:51:02.353162 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/frr/0.log" Mar 19 09:51:02.366818 master-0 kubenswrapper[13205]: I0319 09:51:02.366772 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/reloader/0.log" Mar 19 09:51:02.377919 master-0 kubenswrapper[13205]: I0319 09:51:02.377848 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/frr-metrics/0.log" Mar 19 09:51:02.390098 master-0 kubenswrapper[13205]: I0319 09:51:02.390050 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/kube-rbac-proxy/0.log" Mar 19 09:51:02.432118 master-0 kubenswrapper[13205]: I0319 09:51:02.432054 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/kube-rbac-proxy-frr/0.log" Mar 19 09:51:02.443182 master-0 kubenswrapper[13205]: I0319 09:51:02.443145 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-frr-files/0.log" Mar 19 09:51:02.457352 master-0 kubenswrapper[13205]: I0319 09:51:02.457299 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-reloader/0.log" Mar 19 09:51:02.473372 master-0 kubenswrapper[13205]: I0319 09:51:02.473304 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-88xgg_d1114a57-d615-4763-928e-664cf7513b52/cp-metrics/0.log" Mar 19 09:51:02.490124 master-0 kubenswrapper[13205]: I0319 09:51:02.490077 13205 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qvpr9_d081bb39-3cce-4a28-b53c-41414b48a4be/frr-k8s-webhook-server/0.log" Mar 19 09:51:02.518984 master-0 kubenswrapper[13205]: I0319 09:51:02.518852 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-54b99f6f6b-hk7ql_b3b14285-61c6-4760-85bc-64667c85f8af/manager/0.log" Mar 19 09:51:02.534643 master-0 kubenswrapper[13205]: I0319 09:51:02.534590 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-69bcd667c-x84zz_47802e4a-72b0-4595-a33e-ca548f695f60/webhook-server/0.log" Mar 19 09:51:02.629598 master-0 kubenswrapper[13205]: I0319 09:51:02.629511 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dvddw_3391abc7-00e2-4a16-95b0-4961dabda05b/speaker/0.log" Mar 19 09:51:02.640063 master-0 kubenswrapper[13205]: I0319 09:51:02.639992 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dvddw_3391abc7-00e2-4a16-95b0-4961dabda05b/kube-rbac-proxy/0.log" Mar 19 09:51:03.831157 master-0 kubenswrapper[13205]: I0319 09:51:03.831115 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-598fbc5f8f-smksb_9076d131-644a-4332-8a70-34f6b0f71575/cluster-node-tuning-operator/1.log" Mar 19 09:51:03.832456 master-0 kubenswrapper[13205]: I0319 09:51:03.832420 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-598fbc5f8f-smksb_9076d131-644a-4332-8a70-34f6b0f71575/cluster-node-tuning-operator/0.log" Mar 19 09:51:03.853003 master-0 kubenswrapper[13205]: I0319 09:51:03.852947 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-rb955_fce9ea11-1498-4ef6-ba71-d125c193159c/tuned/0.log" Mar 19 09:51:04.528872 master-0 
kubenswrapper[13205]: I0319 09:51:04.528839 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-cn2rm_4598a0ea-67ba-4982-bc67-e77c90508261/prometheus-operator/0.log" Mar 19 09:51:04.541546 master-0 kubenswrapper[13205]: I0319 09:51:04.541495 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-64887bc684-nrzl5_1966ff81-3727-4448-8c89-7c412a6d7df2/prometheus-operator-admission-webhook/0.log" Mar 19 09:51:04.557990 master-0 kubenswrapper[13205]: I0319 09:51:04.557948 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-64887bc684-vgrm2_00c14551-3cf9-4127-9412-3e40820820ea/prometheus-operator-admission-webhook/0.log" Mar 19 09:51:04.585200 master-0 kubenswrapper[13205]: I0319 09:51:04.585164 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-cgw5q_642e3b20-075e-4abe-9258-c47e385f1995/operator/0.log" Mar 19 09:51:04.603484 master-0 kubenswrapper[13205]: I0319 09:51:04.602733 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-d4f7cd8d5-2v5tt_1ba0a560-3156-446f-b451-47f129706196/perses-operator/0.log" Mar 19 09:51:06.128059 master-0 kubenswrapper[13205]: I0319 09:51:06.127911 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-7qnf9_e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/kube-apiserver-operator/0.log" Mar 19 09:51:06.146361 master-0 kubenswrapper[13205]: I0319 09:51:06.146303 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-7qnf9_e03f97d1-b6fe-4fc9-8cb5-c97af7a651bb/kube-apiserver-operator/1.log" Mar 19 09:51:06.566139 master-0 kubenswrapper[13205]: I0319 09:51:06.566089 13205 
log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-65r6f_69bb71dd-c77e-44e6-8e15-bc8dc75f41ab/cert-manager-controller/0.log" Mar 19 09:51:06.576926 master-0 kubenswrapper[13205]: I0319 09:51:06.576873 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-8wzjf_5c953c26-428d-4eb2-b88a-d98a21bb27c2/cert-manager-cainjector/0.log" Mar 19 09:51:06.596582 master-0 kubenswrapper[13205]: I0319 09:51:06.595369 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-ntbld_cfba36a4-7184-4ff9-b46f-e196c7252fbf/cert-manager-webhook/0.log" Mar 19 09:51:06.878172 master-0 kubenswrapper[13205]: I0319 09:51:06.878060 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_434aabfa-50db-407e-92d3-a034696613e3/installer/0.log" Mar 19 09:51:06.905822 master-0 kubenswrapper[13205]: I0319 09:51:06.905768 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_ff98fb1e-7a1f-4657-b085-743d6f2d28e2/installer/0.log" Mar 19 09:51:06.935569 master-0 kubenswrapper[13205]: I0319 09:51:06.935508 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_0f12c099-d9a7-48a9-9965-c339c4e32d31/installer/0.log" Mar 19 09:51:06.960550 master-0 kubenswrapper[13205]: I0319 09:51:06.960385 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_bc2139b7-8af8-4294-aee2-3e7429d2b1fe/installer/0.log" Mar 19 09:51:06.984738 master-0 kubenswrapper[13205]: I0319 09:51:06.984689 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-7-master-0_52d83e52-4097-4c66-ad8b-bd524ff59c95/installer/0.log" Mar 19 09:51:07.120914 master-0 kubenswrapper[13205]: I0319 09:51:07.120861 13205 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver/0.log" Mar 19 09:51:07.133615 master-0 kubenswrapper[13205]: I0319 09:51:07.133448 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-cert-syncer/0.log" Mar 19 09:51:07.148293 master-0 kubenswrapper[13205]: I0319 09:51:07.148247 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-cert-regeneration-controller/0.log" Mar 19 09:51:07.156858 master-0 kubenswrapper[13205]: I0319 09:51:07.156810 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-insecure-readyz/0.log" Mar 19 09:51:07.175964 master-0 kubenswrapper[13205]: I0319 09:51:07.175917 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/kube-apiserver-check-endpoints/0.log" Mar 19 09:51:07.194830 master-0 kubenswrapper[13205]: I0319 09:51:07.194788 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_3cae843f2a8e3c3c3212b1177305c1d5/setup/0.log" Mar 19 09:51:07.992515 master-0 kubenswrapper[13205]: I0319 09:51:07.991095 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-r28hm_3374940a-612d-4335-8236-3ffe8d6e73a5/kube-rbac-proxy/0.log" Mar 19 09:51:08.009799 master-0 kubenswrapper[13205]: I0319 09:51:08.009749 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-r28hm_3374940a-612d-4335-8236-3ffe8d6e73a5/manager/1.log" Mar 19 09:51:08.165400 master-0 
kubenswrapper[13205]: I0319 09:51:08.165329 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-r28hm_3374940a-612d-4335-8236-3ffe8d6e73a5/manager/0.log"
Mar 19 09:51:08.596426 master-0 kubenswrapper[13205]: I0319 09:51:08.596394 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-65r6f_69bb71dd-c77e-44e6-8e15-bc8dc75f41ab/cert-manager-controller/0.log"
Mar 19 09:51:08.610816 master-0 kubenswrapper[13205]: I0319 09:51:08.610783 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-8wzjf_5c953c26-428d-4eb2-b88a-d98a21bb27c2/cert-manager-cainjector/0.log"
Mar 19 09:51:08.625614 master-0 kubenswrapper[13205]: I0319 09:51:08.625583 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-ntbld_cfba36a4-7184-4ff9-b46f-e196c7252fbf/cert-manager-webhook/0.log"
Mar 19 09:51:09.110636 master-0 kubenswrapper[13205]: I0319 09:51:09.110582 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-k4lpt_23525aee-4327-4f3f-a471-501ab5740c98/nmstate-console-plugin/0.log"
Mar 19 09:51:09.129361 master-0 kubenswrapper[13205]: I0319 09:51:09.129328 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7bhhs_eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7/nmstate-handler/0.log"
Mar 19 09:51:09.145098 master-0 kubenswrapper[13205]: I0319 09:51:09.145070 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-mxtkr_1d13de7b-9da4-4240-a24d-d85ff82b405e/nmstate-metrics/0.log"
Mar 19 09:51:09.157719 master-0 kubenswrapper[13205]: I0319 09:51:09.157683 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-mxtkr_1d13de7b-9da4-4240-a24d-d85ff82b405e/kube-rbac-proxy/0.log"
Mar 19 09:51:09.174481 master-0 kubenswrapper[13205]: I0319 09:51:09.174448 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-95c7k_6ab170f8-577b-4f2c-a8f3-8fe1a5e45274/nmstate-operator/0.log"
Mar 19 09:51:09.194447 master-0 kubenswrapper[13205]: I0319 09:51:09.194388 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-d44x4_21ed9745-5a64-43ba-94be-b27034f5de86/nmstate-webhook/0.log"
Mar 19 09:51:09.869687 master-0 kubenswrapper[13205]: I0319 09:51:09.869647 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/kube-multus-additional-cni-plugins/0.log"
Mar 19 09:51:09.884824 master-0 kubenswrapper[13205]: I0319 09:51:09.884748 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/egress-router-binary-copy/0.log"
Mar 19 09:51:09.898891 master-0 kubenswrapper[13205]: I0319 09:51:09.898857 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/cni-plugins/0.log"
Mar 19 09:51:09.911392 master-0 kubenswrapper[13205]: I0319 09:51:09.911343 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/bond-cni-plugin/0.log"
Mar 19 09:51:09.938212 master-0 kubenswrapper[13205]: I0319 09:51:09.938139 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/routeoverride-cni/0.log"
Mar 19 09:51:09.952046 master-0 kubenswrapper[13205]: I0319 09:51:09.951990 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/whereabouts-cni-bincopy/0.log"
Mar 19 09:51:09.964697 master-0 kubenswrapper[13205]: I0319 09:51:09.964648 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kv6s_979d4d12-a560-4309-a1d3-cbebe853e8ea/whereabouts-cni/0.log"
Mar 19 09:51:09.979522 master-0 kubenswrapper[13205]: I0319 09:51:09.979472 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-64f78496dd-kwdfq_165e3498-b49e-42fa-a614-0680f8c93fc7/multus-admission-controller/0.log"
Mar 19 09:51:09.993719 master-0 kubenswrapper[13205]: I0319 09:51:09.993649 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-64f78496dd-kwdfq_165e3498-b49e-42fa-a614-0680f8c93fc7/kube-rbac-proxy/0.log"
Mar 19 09:51:10.013610 master-0 kubenswrapper[13205]: I0319 09:51:10.013541 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/0.log"
Mar 19 09:51:10.076552 master-0 kubenswrapper[13205]: I0319 09:51:10.071692 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzdzd_157e3524-eb27-41ca-b49d-2697ee1245ca/kube-multus/1.log"
Mar 19 09:51:10.105586 master-0 kubenswrapper[13205]: I0319 09:51:10.104955 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nq9vs_13072c08-c77c-4170-9ebe-98d63968747b/network-metrics-daemon/0.log"
Mar 19 09:51:10.114318 master-0 kubenswrapper[13205]: I0319 09:51:10.114259 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nq9vs_13072c08-c77c-4170-9ebe-98d63968747b/kube-rbac-proxy/0.log"
Mar 19 09:51:10.645587 master-0 kubenswrapper[13205]: I0319 09:51:10.645519 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_lvms-operator-6577d5757-2shjw_65beb37f-3f54-4de2-84af-04c9d50784f9/manager/0.log"
Mar 19 09:51:10.670669 master-0 kubenswrapper[13205]: I0319 09:51:10.670620 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-pclnw_ab2297f7-d3ff-4033-b3c1-fa30756a6e9a/vg-manager/1.log"
Mar 19 09:51:10.672910 master-0 kubenswrapper[13205]: I0319 09:51:10.672874 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-pclnw_ab2297f7-d3ff-4033-b3c1-fa30756a6e9a/vg-manager/0.log"
Mar 19 09:51:11.206958 master-0 kubenswrapper[13205]: I0319 09:51:11.206888 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_e34f1e7e-9148-4fa7-8e5b-6b77ff2c62f4/installer/0.log"
Mar 19 09:51:11.249650 master-0 kubenswrapper[13205]: I0319 09:51:11.249594 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-5-master-0_042705f9-eeff-4d51-808d-6da4be0720d3/installer/0.log"
Mar 19 09:51:11.267189 master-0 kubenswrapper[13205]: I0319 09:51:11.267125 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-6-master-0_cc0cf46f-a311-4083-9187-8fb45c1106dd/installer/0.log"
Mar 19 09:51:11.291364 master-0 kubenswrapper[13205]: I0319 09:51:11.291307 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-6-retry-1-master-0_13594ce3-7087-4af3-85eb-6c50b9e2bfd2/installer/0.log"
Mar 19 09:51:11.447963 master-0 kubenswrapper[13205]: I0319 09:51:11.447897 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a2686abafe708291dc60d5667195e21a/kube-controller-manager/0.log"
Mar 19 09:51:11.505568 master-0 kubenswrapper[13205]: I0319 09:51:11.505331 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a2686abafe708291dc60d5667195e21a/cluster-policy-controller/0.log"
Mar 19 09:51:11.517549 master-0 kubenswrapper[13205]: I0319 09:51:11.516990 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a2686abafe708291dc60d5667195e21a/kube-controller-manager-cert-syncer/0.log"
Mar 19 09:51:11.526483 master-0 kubenswrapper[13205]: I0319 09:51:11.526435 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_a2686abafe708291dc60d5667195e21a/kube-controller-manager-recovery-controller/0.log"
Mar 19 09:51:11.543159 master-0 kubenswrapper[13205]: I0319 09:51:11.543113 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_revision-pruner-6-master-0_1228a411-677c-4ba0-96bb-9c6825839313/pruner/0.log"
Mar 19 09:51:12.215894 master-0 kubenswrapper[13205]: I0319 09:51:12.215826 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-k4lpt_23525aee-4327-4f3f-a471-501ab5740c98/nmstate-console-plugin/0.log"
Mar 19 09:51:12.253910 master-0 kubenswrapper[13205]: I0319 09:51:12.253851 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7bhhs_eb5b47ab-f9ae-4fb1-9acc-7e9c9b2ec6d7/nmstate-handler/0.log"
Mar 19 09:51:12.259305 master-0 kubenswrapper[13205]: I0319 09:51:12.259250 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-pvlq6_f0c75102-6790-4ed3-84da-61c3611186f8/kube-controller-manager-operator/0.log"
Mar 19 09:51:12.269014 master-0 kubenswrapper[13205]: I0319 09:51:12.268960 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-pvlq6_f0c75102-6790-4ed3-84da-61c3611186f8/kube-controller-manager-operator/1.log"
Mar 19 09:51:12.280575 master-0 kubenswrapper[13205]: I0319 09:51:12.280485 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-mxtkr_1d13de7b-9da4-4240-a24d-d85ff82b405e/nmstate-metrics/0.log"
Mar 19 09:51:12.286876 master-0 kubenswrapper[13205]: I0319 09:51:12.286830 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-mxtkr_1d13de7b-9da4-4240-a24d-d85ff82b405e/kube-rbac-proxy/0.log"
Mar 19 09:51:12.300705 master-0 kubenswrapper[13205]: I0319 09:51:12.300641 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-95c7k_6ab170f8-577b-4f2c-a8f3-8fe1a5e45274/nmstate-operator/0.log"
Mar 19 09:51:12.313050 master-0 kubenswrapper[13205]: I0319 09:51:12.312984 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-d44x4_21ed9745-5a64-43ba-94be-b27034f5de86/nmstate-webhook/0.log"
Mar 19 09:51:14.021426 master-0 kubenswrapper[13205]: I0319 09:51:14.021366 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_014ef8bd-b940-41e2-9239-c238afe6ebae/installer/0.log"
Mar 19 09:51:14.040364 master-0 kubenswrapper[13205]: I0319 09:51:14.040318 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_d1c20f3b-cd10-4eac-88d8-70db61994bc2/installer/0.log"
Mar 19 09:51:14.059496 master-0 kubenswrapper[13205]: I0319 09:51:14.059446 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-7-master-0_7e546c2e-a198-4141-a1de-518b8d71d107/installer/0.log"
Mar 19 09:51:14.096343 master-0 kubenswrapper[13205]: I0319 09:51:14.096280 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_32c74216166e87f3b80af3f77a8bf69d/kube-scheduler/0.log"
Mar 19 09:51:14.108608 master-0 kubenswrapper[13205]: I0319 09:51:14.108563 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_32c74216166e87f3b80af3f77a8bf69d/kube-scheduler-cert-syncer/0.log"
Mar 19 09:51:14.132537 master-0 kubenswrapper[13205]: I0319 09:51:14.132405 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_32c74216166e87f3b80af3f77a8bf69d/kube-scheduler-recovery-controller/0.log"
Mar 19 09:51:14.146134 master-0 kubenswrapper[13205]: I0319 09:51:14.146070 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_32c74216166e87f3b80af3f77a8bf69d/wait-for-host-port/0.log"
Mar 19 09:51:14.166495 master-0 kubenswrapper[13205]: I0319 09:51:14.166452 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_revision-pruner-7-master-0_d26a49eb-85ac-47e1-b098-557f6e625958/pruner/0.log"
Mar 19 09:51:14.730042 master-0 kubenswrapper[13205]: I0319 09:51:14.727697 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-zddz9_d664acc4-ec4f-4078-ae93-404a14ea18fc/kube-scheduler-operator-container/0.log"
Mar 19 09:51:14.746447 master-0 kubenswrapper[13205]: I0319 09:51:14.746372 13205 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-zddz9_d664acc4-ec4f-4078-ae93-404a14ea18fc/kube-scheduler-operator-container/1.log"